SQLite DB querying takes a long time on WP8 - C#

I am building a Windows Phone 8 app using sqlite-net, using this link as a reference:
http://developer.nokia.com/community/wiki/How_to_use_SQLite_in_Windows_Phone
There is a database in the project which is seeded into Isolated Storage. The database contains only one table, which has almost 26k entries.
I am trying to connect to that database in my MainPage.xaml.cs as follows:
protected override void OnNavigatedTo(System.Windows.Navigation.NavigationEventArgs e)
{
    base.OnNavigatedTo(e);
    using (SQLiteConnection db = new SQLiteConnection(App._dbPath))
    {
        db.GetTableInfo("IWMCemeteries");
        try
        {
            // Materializes all ~26k rows into memory just to count them
            List<IWMCemeteries> cemeteriesList = db.Table<IWMCemeteries>().ToList<IWMCemeteries>();
            MessageBox.Show("Number of elements in table is " + cemeteriesList.Count);
        }
        catch (Exception ex)
        {
            Debug.WriteLine(ex.Message);
        }
    }
}
The problem is that it takes too long (over 25 seconds) for the message box to show up.
I tried an alternative method, running a raw query as follows:
List<IWMCemeteries> cemeteries = db.Query<IWMCemeteries>("select * from IWMCemeteries");
MessageBox.Show("Number of elements in list is " + cemeteries.Count);
But this seems to take even longer (almost 30 seconds)!
Can someone please tell me what I am doing wrong here?
Thanks,
Rajeev

There is nothing wrong here as far as I can see. As others have noticed, with 26k rows you are starting to work with a sizeable amount of data. On a mobile device working with a "lite" database, you must adapt your query to what you actually need:
If you want the number of rows, use SELECT COUNT(*).
If you want to display all rows in a list, use paging or asynchronous loading (on scroll down) to fetch only 20 elements at a time (see the sketch below).
In any app, but especially on mobile devices, you have to consider the volume of data being moved.
That way, any request will be near-instant and your application will perform well.
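For illustration, here is a rough sketch of both approaches with sqlite-net, reusing App._dbPath and the IWMCemeteries class from the question (the pageIndex variable is just a placeholder for whatever paging state your UI keeps):

using (SQLiteConnection db = new SQLiteConnection(App._dbPath))
{
    // Count the rows on the SQLite side instead of materializing 26k objects
    int count = db.ExecuteScalar<int>("SELECT COUNT(*) FROM IWMCemeteries");
    MessageBox.Show("Number of elements in table is " + count);

    // Fetch a single page of 20 rows for display
    int pageIndex = 0; // placeholder: which page the UI currently shows
    List<IWMCemeteries> page = db.Table<IWMCemeteries>()
                                 .Skip(pageIndex * 20)
                                 .Take(20)
                                 .ToList();
}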

There's nothing wrong with your query. Just limit the data you fetch from the database. It's a mobile device with limited power, not a full-blown PC.

Related

Increase SQL Bulk Copy speed on Azure

I am working on a project where I have to move an on-premises application over to Azure. We have an upload utility that transfers about 150,000 records to the web app (an MVC app). Unfortunately, I was getting timeout issues after I migrated to Azure. I made several changes, including using SqlBulkCopy and stored procedures instead of SqlCommand. The timeout issue is now resolved, but the data upload takes about 5 minutes to insert the 150,000 records into a table on Azure.
I am using a trial version of Azure, and my database DTU is 20. I would love to keep it at 20 because of the cost; I have a small budget that I am working with. Note that database size isn't a problem, I am well below the quota.
Any suggestions on how I can decrease the time it takes to insert those 150,000 records?
Code Sample
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
{
    bulkCopy.BulkCopyTimeout = 0;
    bulkCopy.BatchSize = 10000;
    bulkCopy.ColumnMappings.Add("Barcode", "Barcode");
    bulkCopy.ColumnMappings.Add("SubCategory", "SubCategory");
    bulkCopy.ColumnMappings.Add("ItemDescription", "ItemDescription");
    bulkCopy.ColumnMappings.Add("CreateDate", "CreateDate");
    bulkCopy.ColumnMappings.Add("RevisedDate", "RevisedDate");
    bulkCopy.DestinationTableName = "Items";
    try
    {
        bulkCopy.WriteToServer(dtTblData);
        destinationConnection.Close();
    }
    catch (Exception ex)
    {
        this.Logs.Add(DateTime.Now.ToString() + ": " + ex.Message);
    }
}
FYI: During the insert operation the DTU for my database reaches 100%.
Using the SqlBulkCopyOptions.TableLock option will increase performance.
So if you can lock the table, you should without a doubt use it.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection, SqlBulkCopyOptions.TableLock))
{
    // ...code...
}
Beyond this option there is not much more you can do, since you already use SqlBulkCopy. The bottleneck is the database tier itself, which you cannot upgrade because of the budget.
Besides the table locking Jonathan mentioned, the only real way to increase performance is to increase the DTUs for the service.
However, you don't need to leave the database on the higher setting forever. If this bulk load is an infrequent operation, you could temporarily raise the DTUs of the database, do your load, then lower the DTUs back down. You are only billed at the higher rate for the time you were actually running at it.
You can change the tier via code using the Azure SDK: the Microsoft.Azure.Management.Sql.DatabasesOperationsExtensions class lets you update a Database object, and setting its RequestedServiceObjectiveId to a higher-tier objective changes the performance level (the 20 DTUs you are on now is an S1 objective; you could move up to an S2 (50 DTUs) during the bulk load).
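If you would rather not pull in the management SDK, the same scale-up/scale-down can be scripted with plain T-SQL against the logical server. A rough sketch (the SetServiceObjective helper, database name and connection strings are placeholders; ALTER DATABASE completes asynchronously, so the new objective may take a little while to become active):

using System.Data.SqlClient;

// Hypothetical helper: change the service objective of an Azure SQL database.
// Run it while connected to the master database of the logical server.
static void SetServiceObjective(string masterConnectionString, string databaseName, string objective)
{
    using (SqlConnection conn = new SqlConnection(masterConnectionString))
    {
        conn.Open();
        // Objective names: 'S1' (20 DTU), 'S2' (50 DTU), ...
        string sql = "ALTER DATABASE [" + databaseName + "] MODIFY (SERVICE_OBJECTIVE = '" + objective + "')";
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}

// Usage sketch: scale up, do the bulk load, scale back down.
// SetServiceObjective(masterConnString, "MyDatabase", "S2");
// ... run the SqlBulkCopy upload ...
// SetServiceObjective(masterConnString, "MyDatabase", "S1");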

SQL Server CE two way sync with remote Access database

I'm working on a pretty special, legacy project where I need to build an app for PDA devices under Windows Mobile 6.5. The devices have a local database (SQL Server CE) which we are supposed to sync with a remote database (Microsoft Access) whenever they are docked and have network access.
So the local database using SQL Server CE works fine, but I can’t figure out a way to sync it to the Access database properly.
I read that ODBC and OLE DB are unsupported under Windows Mobile 6.5, most resources I find are obsolete or have dead links, and the only way I found was to export the relevant local database tables to XML, in the hope of building a VBA component for Access to import them properly (and figure out the backwards sync).
Update on the project and new questions
First of all, thanks to everyone who provided a useful answer, and to @josef, who saved me a lot of time with the auto path on this thread.
So a remote SQL Server is a no-go for security reasons (the client is paranoid about security and won't provide me a server). I'm therefore tied to SQL Server CE on the PDA and Access on the computer.
As for the sync:
The export is fine: I'm using multiple data adapters and the WriteXml method to generate XML files, transmitted by FTP when the device is plugged back in. Those files are then automatically imported into the Access database (see code at the end).
My problem is the import: I can read data from an Access-generated file through XML readers. This data is then inserted into a DataSet (in fact, I can even print the data on the PDA screen), but I can't figure out a way to do an "UPSERT" on the PDA's database. So I need a creative way to update/insert the data into tables that may already contain rows with the same id.
I tried two methods, both giving SQL errors (from what I understand, SQL Server CE doesn't handle stored procedures or T-SQL control flow). Example with a simple query that is supposed to update the "available" flag of some storage spots:
try
{
    SqlCeDataAdapter dataAdapter = new SqlCeDataAdapter();
    DataSet xmlDataSet = new DataSet();
    xmlDataSet.ReadXml(localPath + @"\import.xml");
    dataGrid1.DataSource = xmlDataSet.Tables[1];
    _conn.Open();
    int i = 0;
    for (i = 0; i <= xmlDataSet.Tables[1].Rows.Count - 1; i++)
    {
        spot = xmlDataSet.Tables[1].Rows[i].ItemArray[0].ToString();
        is_available = Convert.ToBoolean(xmlDataSet.Tables[1].Rows[i].ItemArray[1]);
        SqlCeCommand importSpotCmd = new SqlCeCommand(@"
            IF EXISTS (SELECT spot FROM spots WHERE spot=@spot)
            BEGIN
                UPDATE spots SET available=@available
            END
            ELSE
            BEGIN
                INSERT INTO spots(spot, available)
                VALUES(@spot, @available)
            END", _conn);
        importSpotCmd.Parameters.Add("@spot", spot);
        importSpotCmd.Parameters.Add("@available", is_available);
        dataAdapter.InsertCommand = importSpotCmd;
        dataAdapter.InsertCommand.ExecuteNonQuery();
    }
    _conn.Close();
}
catch (SqlCeException sql_ex)
{
    MessageBox.Show("SQL database error: " + sql_ex.Message);
}
I also tried this query, with the same problem: SQL Server CE apparently doesn't handle ON DUPLICATE KEY (I think it's MySQL-specific).
INSERT INTO spots (spot, available)
VALUES(@spot, @available)
ON DUPLICATE KEY UPDATE spots SET available=@available
The code of the export method, now fixed and working fine, but still relevant for anybody who wants to know:
private void exportBtn_Click(object sender, EventArgs e)
{
    const string sqlQuery = "SELECT * FROM storage";
    const string sqlQuery2 = "SELECT * FROM spots";
    // Get the current execution directory
    string autoPath = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().GetName().CodeBase);
    using (SqlCeConnection _conn = new SqlCeConnection(_connString))
    {
        try
        {
            SqlCeDataAdapter dataAdapter1 = new SqlCeDataAdapter(sqlQuery, _conn);
            SqlCeDataAdapter dataAdapter2 = new SqlCeDataAdapter(sqlQuery2, _conn);
            _conn.Open();
            DataSet ds = new DataSet("SQLExport");
            dataAdapter1.Fill(ds, "stock");
            dataAdapter2.Fill(ds, "spots");
            ds.WriteXml(autoPath + @"\export.xml");
        }
        catch (SqlCeException sql_ex)
        {
            MessageBox.Show("SQL database error: " + sql_ex.Message);
        }
    }
}
As Access is more or less a stand-alone DB solution, I strongly recommend going with a full-flavored SQL Server plus IIS to set up Merge Replication between the SQL CE data and the SQL Server data.
This is described with full sample code and setup in the book "Programming the .Net Compact Framework" by Paul Yao and David Durant (chapter 8, Synchronizing Mobile Data).
For a working sync, all changes to the defined tables and data on the server and on the CE device must be tracked (done via GUIDs, unique numbers) with their timestamps, and conflict handling has to be defined.
If the data is never changed by other means on the server, you may simply track device-side changes only and then push them to the Access database. This could be done by another app that does bulk updates as described here.
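For the upsert itself on the SQL Server CE side, since IF EXISTS blocks and ON DUPLICATE KEY are not available, one common workaround is to try an UPDATE first and only INSERT when no row was affected. A minimal sketch, assuming the spots table, the spot/is_available variables and the open SqlCeConnection _conn from the question's import loop:

// Try the UPDATE first; if no row matched, fall back to an INSERT.
SqlCeCommand updateCmd = new SqlCeCommand(
    "UPDATE spots SET available = @available WHERE spot = @spot", _conn);
updateCmd.Parameters.Add("@spot", spot);
updateCmd.Parameters.Add("@available", is_available);

if (updateCmd.ExecuteNonQuery() == 0)
{
    SqlCeCommand insertCmd = new SqlCeCommand(
        "INSERT INTO spots (spot, available) VALUES (@spot, @available)", _conn);
    insertCmd.Parameters.Add("@spot", spot);
    insertCmd.Parameters.Add("@available", is_available);
    insertCmd.ExecuteNonQuery();
}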
If you do not want to go the expensive SQL Server route, there are cheaper solutions with the free SQLite (available for CE and the Compact Framework too) and a commercial sync tool for SQLite to MS Access like DBSync.
If you are experienced, you may create your own SQLite to MS ACCESS sync tool.

Pushing DataBase to PDA results in empty database with RAPI2

I have a WorkAbout Pro 4, and on this I run an application.
This application uses an SQLite database.
I also run a computer program alongside it, in which I use RAPI2 to work with my handheld device.
As soon as I connect my device, a function triggers and first pulls my database from my handheld to my computer.
I then do some stuff with it and push it back. The problem is that the database that comes back is always 0 KB and has no data, not even tables.
// Create my paths
string myDevice = device.GetFolderPath(SpecialFolder.ProgramFiles);
string deviceFile = myDevice + @"\PPPM\SQL_PPPM.s3db";
string myComputer = Path.GetDirectoryName(Assembly.GetExecutingAssembly().GetName().CodeBase).Substring(6);
string localFile = myComputer + @"\SQL_PPPM.s3db";

// Get the database
RemoteFile.CopyFileFromDevice(device, deviceFile, localFile, true);

// Do stuff
try
    ....
catch (Exception ex)
    ....

// Push the database back
RemoteFile.CopyFileToDevice(device, localFile, deviceFile, true);
At first I thought it was because I am not able to push a database over the connection, so I tried pulling a full database to my computer, but this works just fine.
Then I placed an empty text file on my handheld, pulled it, added text and pushed it back, and this also works fine.
So the only thing that goes wrong is when I try to push a full database back to my handheld, resulting in a 0 KB empty database.
Does anyone have an idea why this is?
Lotus~
Edit: if you know a better way to detect whether a device is connected and to push/pull files from a PDA, please let me know.
I faced the same problem as well.
It most likely has to do with your SQLite connections still being open.
The thing that worked best for me was to modify my SqlDataAccess class and start using 'using' blocks.
Example:
using (var connection = new SQLiteConnection(ConnectionString))
{
    try
    {
        connection.Open();
    }
    catch (Exception e)
    {
        throw new Exception(e.Message);
    }
    finally
    {
        connection.Close();
    }
}
For me, the same approach worked with data adapters.
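Applied to the pull/modify/push flow in the question, the point is simply that every SQLite connection must be disposed before CopyFileToDevice is called, so the local file is no longer locked or incomplete. A rough sketch (paths as in the question; the exact connection string depends on the SQLite wrapper you use):

// Pull the database from the device
RemoteFile.CopyFileFromDevice(device, deviceFile, localFile, true);

// Do the local work inside a using block so the file handle is released
using (var connection = new SQLiteConnection("Data Source=" + localFile))
{
    connection.Open();
    // ... read/update the local copy here ...
}   // the connection (and its lock on the file) is released here

// Only now push the file back to the device
RemoteFile.CopyFileToDevice(device, localFile, deviceFile, true);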
Good luck in the future, and remember never to leave yourself in a DenverCoder9 situation. You should have answered this a long time ago.

How do I reset the system cache of WLAN info?

I am trying to write a WLAN fingerprinting program using NativeWifi in C#. To do this, I run a loop to get the WLAN information many times and later use MATLAB to average/analyze the data.
The problem is that I get all the same values while the program is running, even as I move about the house. From the internet I've seen that there is a cache that stores the data of available networks. I was wondering if there is a system call that resets this cache.
I have also seen this behaviour using the cmd call
netsh wlan show networks mode=bssid
This gives me the same values until I open the available Wi-Fi networks in my OS; if I run it again after that, it gives different values.
Edit: This system will be for my use only, so I would be comfortable starting over on a Linux platform if there is a known library that can handle this for me. I don't even know what to google to get this information, though. Anything related to "network cache" takes me to help threads on unrelated topics...
I will provide the relevant part of my code below:
public void get_info_instance(StreamWriter file)
{
    try
    {
        foreach (WlanClient.WlanInterface wlanIface in client.Interfaces)
        {
            Wlan.WlanBssEntry[] wlanBssEntries = wlanIface.GetNetworkBssList();
            foreach (Wlan.WlanBssEntry network in wlanBssEntries)
            {
                int rss = network.rssi;
                byte[] macAddr = network.dot11Bssid;
                string tMac = "";
                for (int i = 0; i < macAddr.Length; i++)
                {
                    tMac += macAddr[i].ToString("x2").PadLeft(2, '0').ToUpper();
                }
                file.WriteLine("Found network: " + System.Text.ASCIIEncoding.ASCII.GetString(network.dot11Ssid.SSID).ToString());
                file.WriteLine("Signal: " + network.linkQuality + "%");
                file.WriteLine("BSS Type: " + network.dot11BssType + ".");
                file.WriteLine("RSSID: " + rss.ToString());
                file.WriteLine("BSSID: " + tMac);
                file.WriteLine(" ");
            }
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
Internally, netsh is powered by this API. What this means is that calling netsh wlan show networks mode=bssid just returns the cached list of networks that showed up during the last scan. This is what you've discovered.
So in order to refresh this cache, you need to trigger a scan. If the C# library you are using includes it, you can make this happen on demand with a call to WlanScan. I am not sure which C# wrapper you are using, but it probably includes this function. When you get a scan-complete notification (register with source WLAN_NOTIFICATION_SOURCE_ACM and look out for wlan_notification_acm_scan_list_refresh), the cache should be updated.
If you let me know which C# library you are using, maybe I can point you to the relevant functions.
You mentioned that opening the available networks causes the cache to refresh. This is because opening the available networks triggers a call to WlanScan.
Profiles are not relevant to the available network list; profiles are what the WLAN service uses to keep track of which networks are configured on your machine, and deleting them does not make WlanSvc scan again. It may be a coincidence that deleting them happens to coincide with a scan, but that is more of a side effect than the designed usage.
Edit: to subscribe to notifications using the Managed Wifi API you are using, this snippet should work:
wlanIface.WlanNotification += wlanIface_WlanNotification;
And the callback:
static void wlanIface_WlanNotification(Wlan.WlanNotificationData notifyData)
{
    if (notifyData.notificationCode == (int)Wlan.WlanNotificationCodeAcm.ScanComplete)
    {
        Console.WriteLine("Scan Complete!");
    }
}
You can test this by running this, then opening the available networks on Windows. You should see "Scan Complete" shortly after you open it each time. You can use a messagebox instead of Console.WriteLine if you prefer.
To trigger a scan yourself:
wlanIface.Scan();
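Putting these pieces together, a rough sketch of one fingerprinting pass could look like this (Managed Wifi API assumed, as in the question; the 5-second timeout is a placeholder, and client is the WlanClient from your code):

// One pass: trigger a fresh scan, wait for it to complete, then read the refreshed BSS list.
System.Threading.AutoResetEvent scanDone = new System.Threading.AutoResetEvent(false);

foreach (WlanClient.WlanInterface wlanIface in client.Interfaces)
{
    // Signal when this interface reports a completed scan
    wlanIface.WlanNotification += notifyData =>
    {
        if (notifyData.notificationCode == (int)Wlan.WlanNotificationCodeAcm.ScanComplete)
            scanDone.Set();
    };

    wlanIface.Scan();          // refreshes the cached network list
    scanDone.WaitOne(5000);    // wait up to 5 seconds for the scan to finish

    foreach (Wlan.WlanBssEntry network in wlanIface.GetNetworkBssList())
    {
        // record SSID / BSSID / RSSI here, as in get_info_instance above
    }
}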
For all interfaces:
netsh wlan delete profile name=* i=*
You might not want to do it on all interfaces; you can hard-code the interface in there for a faster result.

How might I set up data plumbing for Silverlight to MySQL in my situation?

In short: What is a good method for setting up read-only data access from Silverlight to a MySQL database?
Here are the details of my situation:
I'm currently trying to set up a Silverlight application to present data from a MySQL database. For now, I need read-only access to the MySQL database (I may set up other tables for complete CRUD functionality at a later date, but for these particular tables I'm only ever going to be concerned with retrieval).
I tried setting it up using RIA Services (July 2009 CTP) with the Entity Framework, but I had trouble debugging it and ended up trying to recompile the source code of the MySQL ADO.NET connector in order to install custom DLLs into the GAC. I wasn't able to get any of this to work correctly.
My problem was that I had date values stored as 0000-00-00 in lots of my MySQL tables. The MySQL ADO.NET connector throws an exception every time it tries to bring down a row with an invalid date in it. I could recompile the connector (see links above), but that feels very much like a hack. I could update the values in the MySQL database to be valid dates, but our IT manager (and effectively our DBA) does not want to do that.
I don't mind learning to work with LINQ (LINQ-to-what?), but I want to avoid concatenating my own strings of SQL commands. Because of the date restriction, I need a way to specify CASE WHEN orders.OrderDate = '0000-00-00' THEN '0001-01-01' ELSE orders.OrderDate END for pretty much every date column.
I'm especially interested to hear from folks who have worked with .NET and MySQL together. What will work in my situation?
Why has no one suggested using an ORM to hide the MySQL details? Both NHibernate and SubSonic support MySQL. Both are very customisable in how they interact with the database and should allow you to cater for malformed dates.
By using an ORM, your data objects become POCOs, and you can use whatever you want to get the data to the Silverlight client: vanilla web services or WCF should be fine, or RIA Services if you want to try the bleeding edge.
IMHO, this will be simpler than setting up a MySQL -> PHP -> XML -> ASP.NET -> Silverlight chain.
My problem was that I had date values stored as 0000-00-00 in lots of my MySQL tables.
Can you just write SELECT NULLIF(SomeDate, '0000-00-00') AS SomeDate FROM SomeTable in your SQL queries? I don't know MySQL, but that's what I would do in T-SQL.
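For what it's worth, MySQL has NULLIF and COALESCE as well, so the mapping could be pushed to the server in the same spirit. A hedged sketch using the MySQL Connector/NET classes (the orders table and OrderDate column come from the question; OrderId and connectionString are placeholders, and I have not verified this against a real zero-date column):

using MySql.Data.MySqlClient;

// Replace the invalid zero date on the MySQL side before the connector ever sees it
string sql = @"SELECT OrderId,
                      COALESCE(NULLIF(OrderDate, '0000-00-00'), '0001-01-01') AS OrderDate
               FROM orders";

using (MySqlConnection conn = new MySqlConnection(connectionString))
using (MySqlCommand cmd = new MySqlCommand(sql, conn))
{
    conn.Open();
    using (MySqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // OrderDate now comes back as a readable value, so the connector should not throw
            DateTime orderDate = Convert.ToDateTime(reader["OrderDate"]);
        }
    }
}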
Here is what I did for a similar problem I was facing.
I used PHP to get data from the MySQL database and turn it into an XML file. I called that file from my Silverlight app and used LINQ to XML to parse the data and make it available to my XAML controls. I am not a programmer by trade, so maybe there is a better way to do it, but this works for my app. Hope this helps. LINQ ROCKS!
Here is a portion of the code:
<?php
header("Content-type: text/xml");
$grb_hostname = "host";
$grb_database = "dbName";
$grb_username = "dbUser";
$grb_password = "dbPwd";
$grb = mysql_connect($grb_hostname, $grb_username, $grb_password);
mysql_select_db($grb_database, $grb);
$results = mysql_query("SELECT * FROM bursts ORDER BY bursts.id DESC");
$xmlOutput = "<?xml version=\"1.0\"?>\n";
$xmlOutput .= "<grbs>\n";
while($row = mysql_fetch_array($results)) {
    $xmlOutput .= "\t<grb id=\"".$row['id']."\" trigger=\"".$row['trigger']."\">\n";
    $xmlOutput .= "\t\t<grb_id>".$row['grb_id']."</grb_id>\n";
    $xmlOutput .= "\t\t<burst_ra>".$row['burst_ra']."</burst_ra>\n";
    $xmlOutput .= "\t\t<burst_dec>".$row['burst_dec']."</burst_dec>\n";
    $xmlOutput .= "\t</grb>\n";
}
$xmlOutput .= "</grbs>";
echo $xmlOutput;
?>
Then in my Silverlight app I have the following:
private void LoadGrbs()
{
    WebClient grbXmlFile = new WebClient();
    // Subscribe before starting the download.
    // Make sure the crossdomainpolicy.xml file exists on the remote server.
    grbXmlFile.DownloadStringCompleted += new DownloadStringCompletedEventHandler(grbsXmlLoaded);
    grbXmlFile.DownloadStringAsync(new Uri("url_xml_generating_php_file", UriKind.Absolute));
}

private void grbsXmlLoaded(object sender, DownloadStringCompletedEventArgs e)
{
    processGrbXml(e.Result);
}

private void processGrbXml(string grbData)
{
    XDocument grbs = XDocument.Parse(grbData);
    var query = from g in grbs.Descendants("grb")
                select new
                {
                    grbId = (string)g.Element("grb_id"),
                    grbDec = (string)g.Element("burst_dec")
                };
    foreach (var grb in query)
    {
        grbListbox.Items.Add(grb.grbId);
    }
}
grbListbox is a ListBox control in my Silverlight app.
You should use RIA Services; the newest version came out last week, and it's included in the Silverlight 4 beta now.
http://silverlight.net/getstarted/silverlight-4-beta/
You don't have to use the Entity Framework with RIA; there are other options. We do, but we use SQL Server, so that might not be your favorite.
They have changed the error handling somewhat in the new RIA stuff, so I'd recommend taking a second look. Here's Brad Abrams' example from last week:
http://microsoftpdc.com/Sessions/CL21
Finally, if you're having a lot of trouble debugging, you could take a look at Fiddler. It's a program that watches the traffic and can display the errors you're getting in a more obvious fashion.
