I'm working on my very first Xamarin app. It's a GPS-based game: one team searches for a certain person who is also using the app. The person "running" sends a GPS location every X minutes. As you can imagine, this game has to work even when the running person has locked their phone or put it in their pocket.
Is it possible to keep this app running under these conditions? Can I keep using location services, timers and database connections? I've already searched the internet for this, and people were saying it was impossible for security reasons, but I couldn't find a proper answer to this specific question.
Can somebody please tell me if this is possible and how to start with it? Many thanks in advance!
The heavyweight solution would be a Service. If you wanted updates at intervals of seconds, this might be the best solution. But since you said minutes, it's probably better to use an alarm. This will fire more or less on schedule even if the framework decides to kill your app while it's in the background. (See the section of the docs about precision.)
Xamarin AlarmManager
The possible drawback of an alarm is that even if GPS is already up and running on the device, it may take a couple of seconds after the alarm fires to receive a location update. Since you mentioned that the user can decrease the interval to seconds, you might want to try both solutions and see which works best. One may be clearly better for your application, or there may be a trade-off between reliability and battery usage.
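As a rough sketch of the alarm approach (Xamarin.Android; `LocationReceiver` is a hypothetical `BroadcastReceiver` of yours that would request a single location fix and upload it when the alarm fires):

```csharp
// Sketch only: assumes a Xamarin.Android project and a hypothetical
// LocationReceiver : BroadcastReceiver that grabs a fix and uploads it.
var intent = new Intent(context, typeof(LocationReceiver));
var pending = PendingIntent.GetBroadcast(
    context, 0, intent, PendingIntentFlags.UpdateCurrent);

var alarmManager = (AlarmManager)context.GetSystemService(Context.AlarmService);

// Inexact repeating alarms are batched by the OS, which saves battery
// but only fires "more or less" on schedule (see the precision docs).
long intervalMs = (long)TimeSpan.FromMinutes(5).TotalMilliseconds;
alarmManager.SetInexactRepeating(
    AlarmType.ElapsedRealtimeWakeup,
    SystemClock.ElapsedRealtime() + intervalMs,
    intervalMs,
    pending);
```

An `ElapsedRealtimeWakeup` alarm wakes the device even when the screen is locked, which matches the locked-phone requirement.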
I have about 10,000 jobs that I want to be handled by approximately 100 threads. Once a thread finishes, the freed 'slot' should get a new job until there are no more jobs available.
Side note: processor load is not an issue; these jobs are mostly waiting for results or (socket) timeouts. The number 100 is something I am going to play with to find an optimum. Each job will take between 2 seconds and 5 minutes, so I want to assign new jobs to free threads rather than pre-assign all jobs to threads.
My problem is that I am not sure how to do this. I'm primarily using Visual Basic .NET (but C# is also OK).
I tried to make an array of threads, but since each job/thread also returns a value (and takes 2 input variables), I used 'WithEvents' and found out that you cannot do that on an array... maybe a collection would work? But I also need a way to manage the threads and feed them new jobs... and all results should go back to the main form (thread)...
I have it all running in one thread, but now I want to speed up.
And then I thought: actually, this is a rather common problem. There is a bunch of work to be done that needs to be distributed across a number of worker threads... So that's why I am asking: what's the most common solution here?
I tried to make the question as generic as possible, so lots of people with the same kind of problem can be helped by your reply. Thanks!
Edit:
What I want to do in more detail is the following. I currently have about 1,200 connected sensors that I want to read from via sockets. The first thing I want to know is whether the device is online (i.e. whether I can connect on ip:port) or not. What happens after connecting depends on the device type, which is only known after the connect. From some devices I just read back a sensor value. Other devices need calibration to be performed, taking up to 5 minutes, with mostly wait times and some reading/setting of values, all via the socket. Some even have FTP that I need to download a file from, but I do that via a socket too.
My problem: lots of waiting time, so lots of opportunity to do things in parallel and speed it up hugely.
My starting point is a list of ip:port addresses, and I want to end up with a file that shows the results; the results are also shown in a textbox on the main form (next to start/pause/stop buttons).
This was very helpful:
Multi Threading with Return value : vb.net
It explains the concept of a BackgroundWorker, which takes away a lot of the hassle. I am now trying to see where it will take me.
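For the general pattern the question asks about, here is a minimal sketch in C# (which the question allows) using `Task` and `SemaphoreSlim` to keep at most 100 jobs in flight; `jobs`, `RunJobAsync`, and `JobResult` are placeholders for the real socket work:

```csharp
// Inside an async method: throttle ~10,000 jobs to 100 concurrent slots.
// As a slot frees up, the next waiting job starts automatically.
var throttle = new SemaphoreSlim(100);

var tasks = jobs.Select(async job =>
{
    await throttle.WaitAsync();          // wait for a free slot
    try
    {
        return await RunJobAsync(job.Ip, job.Port); // placeholder work
    }
    finally
    {
        throttle.Release();              // free the slot for the next job
    }
}).ToList();

JobResult[] results = await Task.WhenAll(tasks);
// Awaiting on the UI thread resumes on the UI thread, so the results can
// be written straight to the main form's textbox from here.
```

Because the jobs are I/O-bound (sockets, timeouts), async tasks use far fewer resources than 100 dedicated threads would.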
EDIT: Context
I have to develop an ASP.NET web application which will allow users to create a "conversion request" for one or several CAO files.
My application should just upload files to a directory where I can convert them.
I then want a service that checks the database updated by the web application to see if a conversion is waiting to be done. Then I have to call a batch command on the file with some arguments.
The thing is that those conversions can freeze if the user supplies a CAO file that was built incorrectly. In that case, we want the user or an admin to be able to cancel the conversion process.
The batch command used to convert is a third-party tool that I can't change. It needs a token to convert, and I can multithread as long as I have more than one token available. Another application can use those tokens at the same moment, so I can't just have a pre-sized thread pool matching the number of tokens I have.
The only way to know whether I can convert is to start the conversion with the command and see whether the logs tell me I have a licence problem, which means either "no token available" or "current licence doesn't accept the given input format". As I only allow supported input formats on the web application, I can't hit the second problem.
The web application is almost done: I can upload files and download results and conversion logs at the end. Now I need to build the service that will take input files, convert them, update the conversion status in the database, and finally add those files to the correct download directory.
I have to work on a service which will check a database at a high frequency (maybe every 5 or 10 seconds) for rows set to "Ready to convert" or "Must be cancelled".
If the row is set to "Ready to convert", I must try to convert it, but I do so using a third-party DLL with a licence token system that currently allows me only 5 simultaneous conversions.
If the row is set to "Must be cancelled", I must kill the conversion, either because it froze and the admin had to kill it or because the user cancelled his own task.
Also, conversion times can be very long, from 1 or 2 seconds to several hours, depending on the file size and how it was created.
I was thinking about a pooling system, as I saw here :
Stackoverflow answer
A pooling system gives me the advantage of isolating the database-reading part from the conversion process. But I have the feeling that I lose a kind of control over background processes, which is maybe just because I'm not used to them.
But I'm not very used to services, and even if this pool system seems good, I don't know how I can cancel a task if needed.
The tool I use to convert works as a simple batch command that just returns an error if no licence is available right now. Using a pool, how can I make the convert thread wait for a licence to become free? Is a simple infinite while loop an appropriate answer? It seems quite bad to me.
Finally, I can't just use a "5-thread pool", as those licences are also used by 2 other applications, which means I never know how many are available at any given time.
The idea of using a pool may also be wrong; as I said, I'm not very used to services, and before starting something stupid, I prefer to ask about the right way to do it.
Moreover, about the database reading/writing, even if I think the second option is better, should I:
Use the big model files I already use in my ASP.NET web application, which will create a lot of objects (one for each row, as they are entity models)?
Or not use entity models but lighter ones, which will be less entity-oriented but probably less resource-demanding? This will also be harder to maintain.
So I'm asking more for advice on how I should do it than for a pure code answer, but some examples could be very useful.
EDIT: to be more precise, I need to find a way to:
(For the moment, I'll stick with only one licence available; I will evolve it later if needed.)
Have a service that runs as a loop and will, if possible, start a new thread for each given request. The service must keep running, as the status can be set to "Required to be cancelled" at any time.
At the moment I am thinking about a task with a cancellation token, which would probably achieve this.
But if the task finds that no token is currently available, how can I tell the service's main loop that it can't process now? I was thinking of having just an integer return value, where the return code indicates the reason for ending: cancellation / no token / ended... Is that a good way to do it?
What I'm hearing is that the biggest bottleneck in your process is the conversion itself; pooling / object mapping / direct SQL doesn't sound as important as the conversion bottleneck.
There are several ways to solve this depending on your environment and what your constraints are. Good, fast, and cheap... pick 2.
As far as "best practice" goes, there are large-scale solutions (ESBs, message-queue based, etc.), there are small-scale solutions (console apps, batch files, PowerShell scripts on the Windows scheduler, etc.) and there are the in-between solutions (this is typically where the "good enough" answer is found). The volume of stuff you need to process should decide which one is the best fit for you.
Regardless of which you choose above...
My first step would be to eliminate variables...
How much volume will you be processing? If you wrote something that's not optimized but works, would that be enough to handle your current volume? (e.g. a console app run by the Windows Task Scheduler every 10-15 seconds that gets killed if it runs for more than, say, 15 minutes)
Does the third-party DLL support multi-threading? If not, that eliminates all your multi-threading-related questions and narrows down your options for scaling out.
You can then determine what approach will fit your problem domain...
will it be the same service deployed on several machines, each hitting the database every 10-15 seconds?, or
will it be one service on one machine using multi-threading?
will it be something else (pooling might or might not be in play)?
Once you have the answers above, the next question is:
will the work required fit within the allocated budget and time constraints of your project? If not, go back to the questions above and see if there are any you can answer differently that would change the answer to this question to yes.
I hope that these questions help you get to a good answer.
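For concreteness, here is one way the cancellable conversion step could be sketched in C#. `convert.exe`, the "licence problem" log text, and `ConvertResult` are assumptions based on the question's description, not the real tool's interface:

```csharp
enum ConvertResult { Done, NoToken, Cancelled }

static async Task<ConvertResult> ConvertAsync(string inputFile, CancellationToken ct)
{
    var startInfo = new ProcessStartInfo
    {
        FileName = "convert.exe",        // hypothetical third-party tool
        Arguments = $"\"{inputFile}\"",
        RedirectStandardOutput = true,
        UseShellExecute = false
    };

    using (var process = Process.Start(startInfo))
    // Killing the process is how a frozen conversion gets cancelled.
    using (ct.Register(() => { try { process.Kill(); } catch { /* already exited */ } }))
    {
        string log = await process.StandardOutput.ReadToEndAsync();
        process.WaitForExit();

        if (ct.IsCancellationRequested) return ConvertResult.Cancelled;
        if (log.Contains("licence problem")) return ConvertResult.NoToken;
        return ConvertResult.Done;
    }
}
```

On `NoToken` the service loop can simply leave the row as "Ready to convert" and retry it on a later poll, which avoids the busy-wait while loop the question worries about.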
I am basically creating a site for recruiters. One of the features of my application requires posting to Facebook periodically. The posting frequency can range from 0 (never) to 4 (high).
For example, if a recruiter has 4 open jobs and the posting frequency set to 4, each job should be posted in its turn: the 1st job on day 1, the 2nd on day 2, the 3rd on day 3, etc., and on day 5 the 1st job again (round-robin fashion).
Had he set the posting frequency to 2, two jobs would be posted daily (thus each job would be posted every 2 days)
My only question is what type of threading I should use for this, since this is all dynamic! Also, any guidelines on what type of information I should store in the database?
I need just a general strategy to solve this problem. No code..
I think you need to separate it from your website; it's better to run the job-posting logic in a service hosted on IIS (I am not sure whether such a thing exists, but I guess it does).
You also need a table for the job queue to remember which jobs need to be posted; your service would then pick them up and post them one by one.
To decide whether it is time to post a job, you can define a timer with a configurable interval that checks whether there is any job to post.
Make sure you keep verbose log details when posting fails. This is important because Facebook might change its API, your API key might become invalid, or anything else might go wrong, and you will need to know what happened.
I also strongly suggest having a webpage that reports the status of the jobs-to-post queue and, for failed jobs, the cause of the problem.
If your program runs non-stop, you can just use one of the Timer classes available in the .NET Framework, without the need for full-blown concurrency (e.g. via the Task Parallel Library).
I suspect, though, that you'll need more than that: some kind of mechanism to detect which jobs were successfully posted and which were "missed" because the program wasn't running (or due to network problems, etc.), so they can be posted the next time the program starts (or the network becomes available). A small local database (such as SQLite or SQL Server Compact) should serve this purpose nicely.
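A minimal sketch of the timer approach (the one-day period and `PostNextDueJob` are placeholders; the recruiter's frequency setting would drive the real period):

```csharp
// Keep a reference to the timer so it isn't garbage collected.
// PostNextDueJob is a placeholder: it would consult the local database,
// post the next due job, and record the attempt's outcome.
var timer = new System.Threading.Timer(
    _ => PostNextDueJob(),
    state: null,
    dueTime: TimeSpan.Zero,          // fire once immediately at startup
    period: TimeSpan.FromDays(1));   // then once per day
```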
If the requirements are as simple as you described, I wouldn't use threading at all. It wouldn't even need to be a long-running app. I'd create a simple app that just tries to post a job and then exits immediately, but I would schedule it to run once every given period (via the Windows Task Scheduler).
This app would first check whether it has already posted a job within the given posting frequency; maybe put a "Last-Successful-Post-Time" setting in your datastore. If it is allowed to post, the app would just query the highest-priority job and post it to Facebook. Once the post succeeds, that job would be downgraded to the lowest priority.
The job priority could just be a simple integer column in your data store. Lower values mean higher priorities.
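The round-robin then falls out of the priority column on its own. A sketch, with the entity and method names assumed for illustration:

```csharp
// Post the job with the lowest Priority value, then move it to the back
// of the line so every other job gets a turn before it comes up again.
Job next = jobs.OrderBy(j => j.Priority).First();
PostToFacebook(next);                          // placeholder for the API call
next.Priority = jobs.Max(j => j.Priority) + 1; // downgrade to lowest priority
SaveChanges();                                 // persist the new ordering
```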
Edit:
I guess what I'm suggesting is that if you have clear boundaries in your requirements, break your project into multiple applications. That way there is a separation of concerns, and you won't need to worry about how to spawn your Facebook notification process inside your website code.
I feel my question is somewhat straightforward, but I've added some details of my problem in the "Background Info" section in case it is too vague.
Question
How does the WorkflowServiceHost determine that a persisted activity, that is Idle due to Delay, has reached the Delay timeout? Does it load it into memory and check every so often, or is there something else happening here?
Background Info
So, I'm a bit new to Workflow, and I'm trying to determine the feasibility of using it for a business process that involves a 3-month delay. Basically, the business process is to allow a customer a 3-month trial of upgraded service. To accomplish this, I'm thinking of implementing a Pick activity that splits between a WCF Receive ("cancel upgrade" receive) and a 3-month Delay activity. If the delay expires, the customer is upgraded permanently and billing is updated. Alternatively, if the cancel is received then, well... yeah, you get the idea :)
So... my concern is how Delay is implemented when using SQL workflow persistence. I don't want to end up with 500 activities in the persistence store that have to be loaded every 10 minutes for 3 months just to check whether the Delay has expired.
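For reference, the Pick described above might look roughly like this in code (WF4, `System.Activities.Statements`); `CancelUpgradeReceive` stands in for the real WCF Receive activity, and the `WriteLine`s for the real upgrade/billing activities:

```csharp
Activity trialUpgrade = new Pick
{
    Branches =
    {
        new PickBranch
        {
            // If 3 months pass with no cancel, make the upgrade permanent.
            Trigger = new Delay { Duration = TimeSpan.FromDays(90) },
            Action = new WriteLine { Text = "Trial expired: upgrade made permanent" }
        },
        new PickBranch
        {
            // Hypothetical wrapper around the WCF Receive for cancellation.
            Trigger = new CancelUpgradeReceive(),
            Action = new WriteLine { Text = "Customer cancelled: revert to old plan" }
        }
    }
};
```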
According to https://msdn.microsoft.com/en-us/library/ee829473(v=vs.110).aspx, the way this works is that:
The SQL Workflow Instance Store runs an internal task that periodically wakes up and detects runnable or activatable workflow instances in the persistence database
An activatable workflow is a workflow that meets either of the following criteria:
The instance is unlocked and has a pending timer that has expired.
The instance has an expired lock on it.
The instance is unlocked and its status is Executing.
So the OP's suggestion ("Does it load it into memory and check every so often?") is correct.
I have not confirmed this behavior myself, but stumbled upon this thread as I was looking for an answer to the exact same question, then found the relevant MSDN article, which I wanted to share with you.
Hope this helps.
I was looking at the same problem with a much smaller window of time. I would still love to hear if/how you resolved this using Workflow.
I am using WF4 and a Pick as you stated above, and my solution is to try to use AppFabric to reactivate the workflows when the delay timer expires. This is based on what I read here: Hosting workflow services with durable timers / delays, and here: Activation of workflow service instances.
I have tested that the Pick works perfectly with a Delay on one side, but now I have to try AppFabric out.
Perhaps, I will come back with an update here on how it works out. Or you can give me some insight?
I have to create an app that will read some info from a DB, process the data, write changes back to the DB, and then send an email with those changes to some users or groups. I will be writing this in C#, and the process must run once a week at a particular time. It will run on Windows Server 2008.
In the past, I would always go the route of creating a Windows service with a timer, setting the time/day for the run in the app.config file so that it can be changed, with only a restart needed to pick up the update.
Recently, though, I have seen blog posts and such recommending writing a console application and using a scheduled task to execute it.
I have read many posts addressing this very issue, but have not seen a definitive answer about which approach is better.
What do any of you think?
Thanks for any thoughts.
If it is a once-per-week application, why waste the resources to have it running in the background for the rest of the week?
A console application seems much more appropriate.
The typical rule of thumb that I use goes something like this. First I ask a few questions:
Frequency of Execution
Frequency of changes to #1
Triggering Mechanism
Basically, from here: if the frequency of execution is daily or less frequent, I'll almost always lean towards a scheduled task. Then, looking at the frequency of changes, if there is high demand for schedule changes, I'll also lean towards a scheduled task, to allow schedule changes without code changes. Lastly, if there is ever a thought of a trigger other than time, I'll lean towards a Windows service to help "future-proof" the application; say, for example, the requirement changes to run every time a user drops a file in folder X.
The basic rule I follow is: if you need to be running continuously because events of interest can happen at any time, use a service (or daemon in UNIX).
If you just want to periodically do something, use a scheduled task (or cron).
The clincher here is your phrase "must be run once a week at a particular time" - go for a scheduled task.
If you have only one application and you need it to run once a week, a scheduled task is probably good, as there is no need to have a separate service and process running on the system that would be idle most of the time.