I have a SQL Server database on a remote system. I am using ADO.NET to bring that database's table values into my local MySQL server.
Now I want to fetch data from the remote DB into my local DB every 15 minutes. I think this will involve running the program every 15 minutes. Please suggest if there is a better way; I want to make it automatic.
I tried putting the package.dtsx in a scheduled task, but it doesn't work. I think we could make an .exe file and then run it.
Can you tell me how to make an .exe file?
How do I do this?
Thanks
Since you've tagged your question with Visual Studio 2008, I'll assume you have that.
Create a new project of type Console Application. If you add your code to that project and compile it, the result will be a .exe file which you can run as a Scheduled task.
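For reference, here is a minimal sketch of such a console app. The table name, column layout and connection strings are made up, and the local side assumes the MySQL Connector/NET provider (MySql.Data) is installed:

    using System;
    using System.Data.SqlClient;     // remote SQL Server
    using MySql.Data.MySqlClient;    // local MySQL (requires MySQL Connector/NET)

    class Program
    {
        static void Main()
        {
            // Hypothetical connection strings - replace with your own.
            const string sqlConn   = "Server=REMOTE;Database=SourceDb;User Id=user;Password=pass;";
            const string mysqlConn = "Server=localhost;Database=targetdb;Uid=user;Pwd=pass;";

            using (var src = new SqlConnection(sqlConn))
            using (var dst = new MySqlConnection(mysqlConn))
            {
                src.Open();
                dst.Open();

                // Read from the remote table and upsert into the local one.
                var read = new SqlCommand("SELECT Id, Name FROM dbo.MyTable", src);
                using (var reader = read.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        var write = new MySqlCommand(
                            "REPLACE INTO mytable (id, name) VALUES (@id, @name)", dst);
                        write.Parameters.AddWithValue("@id", reader.GetInt32(0));
                        write.Parameters.AddWithValue("@name", reader.GetString(1));
                        write.ExecuteNonQuery();
                    }
                }
            }
        }
    }

Once compiled, you can register the .exe to run every 15 minutes from the command line, e.g. schtasks /create /tn "RemoteImport" /tr "C:\path\to\Importer.exe" /sc minute /mo 15.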
However, since you are talking about a .dtsx package, another option would be to schedule your .dtsx package as a job on the server you are importing the data into.
Read this for how to schedule a job in SQL Server.
Related
I have an application which uses an Oracle database, so installing the application involves running some Oracle command script files to create the database and perform some DDL operations.
I was trying to prepare an installation wizard as a C# Forms application. This wizard needs to run those commands. My questions are: is a C# Windows Forms app a good choice for this? And how do I do it, i.e. how do I run Oracle command script files from inside the application? I need a function that takes the file path as an input parameter and executes the commands within the script files...
Thanks in advance
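For what it's worth, a rough sketch of such a function, assuming the ODP.NET managed driver (Oracle.ManagedDataAccess) and a script of plain ';'-separated statements. Real scripts containing PL/SQL blocks need smarter splitting (handling '/' terminators), or shelling out to SQL*Plus instead:

    using System;
    using System.IO;
    using Oracle.ManagedDataAccess.Client; // assumes ODP.NET managed driver

    static class ScriptRunner
    {
        // Naive runner: assumes ';'-separated statements and no PL/SQL blocks.
        public static void RunScript(string path, string connectionString)
        {
            string script = File.ReadAllText(path);
            using (var conn = new OracleConnection(connectionString))
            {
                conn.Open();
                foreach (string statement in script.Split(';'))
                {
                    string sql = statement.Trim();
                    if (sql.Length == 0) continue;
                    using (var cmd = new OracleCommand(sql, conn))
                        cmd.ExecuteNonQuery();
                }
            }
        }
    }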
I've written a script in C# which I have to execute on a daily basis. It's something like an updater for my database: it pulls some data from Active Directory using CSVDE, saves that info into CSV files in the same directory, then parses those files and updates my database if necessary. Obviously I don't want to run it manually every day, so instead I added a scheduled task. My problem is that it doesn't create the expected files, at least not in the same path. If I run the script manually it works like a charm. After a scheduled run I also searched the machine for those files by filename to check whether they had been created somewhere else, but no luck.
Do you have any idea why it doesn't work the same way?
It was an issue with the credentials under which the script ran. Solved now.
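For anyone hitting the same symptom: another common cause is that Task Scheduler starts the process with a different working directory, so relative paths resolve somewhere unexpected. One defensive fix is to anchor file paths to the executable's own folder (file name here is hypothetical):

    using System;
    using System.IO;

    class PathDemo
    {
        static void Main()
        {
            // Resolve paths against the executable's folder, not the current
            // working directory, so scheduled and interactive runs agree.
            string baseDir = AppDomain.CurrentDomain.BaseDirectory;
            string csvPath = Path.Combine(baseDir, "export.csv"); // hypothetical file name
            Console.WriteLine(csvPath);
        }
    }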
I am using SQL Server 2008 R2 and VS2010, and I made a simple application using these tools. I attach the database as an .mdf in my application, and when I deploy the application on another machine it works fine. Now I am planning a new release of my app with some extended features. I can update the code via DLLs, but the problem is updating the .mdf file. To handle this I am exporting the database into .xls sheets (the application has a utility to back up the database) and then importing them into SQL Server to create a new .mdf file. Does someone have a better solution for this? Can I open the old version of the .mdf file in SQL Server (third-party software) and execute DML/DDL scripts on it to make the latest code and database compatible? Could I keep a .sql file in my project code and execute it with some utility? Is there a class in C# which can handle this?
I did not get your query completely. Do you want to upgrade the DB through the application?
You can of course run .sql files through your application, but I'm not sure it would help you change the Database configuration.
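For completeness, a minimal sketch of running a .sql file from C#. This assumes the file uses 'GO' as a batch separator, the way SSMS-generated scripts do:

    using System;
    using System.Data.SqlClient;
    using System.IO;
    using System.Text.RegularExpressions;

    static class SqlFileRunner
    {
        public static void Run(string scriptPath, string connectionString)
        {
            string script = File.ReadAllText(scriptPath);

            // Split on 'GO' lines; GO is an SSMS/sqlcmd batch separator,
            // not T-SQL, so it cannot be sent to the server directly.
            string[] batches = Regex.Split(script, @"^\s*GO\s*$",
                RegexOptions.Multiline | RegexOptions.IgnoreCase);

            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                foreach (string batch in batches)
                {
                    if (batch.Trim().Length == 0) continue;
                    using (var cmd = new SqlCommand(batch, conn))
                        cmd.ExecuteNonQuery();
                }
            }
        }
    }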
Alternatively, if you already have the updated .mdf file and the database name is the same, you can follow these steps:
1. Detach the database in the third-party environment through SSMS.
2. Replace the .mdf, .ldf and .ndf files (if any) on disk.
3. Attach the updated .mdf file.
This will pick up the new object definitions as well as the data.
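If you ever want to automate those three steps instead of doing them in SSMS, a rough sketch might look like this (the database name and paths are made up, and this needs sufficient server permissions):

    using System.Data.SqlClient;
    using System.IO;

    class MdfSwap
    {
        static void Main()
        {
            // Connect to master so the user database can be detached/attached.
            using (var conn = new SqlConnection(
                "Server=.;Database=master;Integrated Security=true;"))
            {
                conn.Open();

                // 1. Detach.
                new SqlCommand("EXEC sp_detach_db 'MyDb'", conn).ExecuteNonQuery();

                // 2. Replace the files on disk (hypothetical paths).
                File.Copy(@"C:\updates\MyDb.mdf", @"C:\data\MyDb.mdf", true);
                File.Copy(@"C:\updates\MyDb_log.ldf", @"C:\data\MyDb_log.ldf", true);

                // 3. Re-attach the updated files.
                new SqlCommand(
                    @"CREATE DATABASE MyDb
                      ON (FILENAME = 'C:\data\MyDb.mdf'),
                         (FILENAME = 'C:\data\MyDb_log.ldf')
                      FOR ATTACH", conn).ExecuteNonQuery();
            }
        }
    }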
As far as I'm aware, there is no process for merging .mdf files, because SQL Server might not be able to match up the corresponding objects properly (the sys tables may differ), and it would not know which data to keep in the final database if the table structures, constraints or data conflict.
However, looking at your requirement, the best I can suggest is:
1. Generate the ALTER scripts for the modified tables (by right-clicking on the object name and using the Script Table As... option). Of course, I assume you have the list of modified objects and the modifications.
2. Connect the two DB servers over the network and write an SSIS package, or use Import Data, to move the data from the old DB to the new one for the tables you want.
Hope this helps.
I have a WPF project set up to use a local SQL Server Compact database through an ADO.NET Entity Data Model in Visual Studio Express 2012 for Desktop. The project works great, on first run I can load all of the data, manipulate it as I please and come back later with the changed data still in place.
While doing a little restructuring of the schema, I noticed that the data visible to VS was only the very first bits of data I entered manually when creating the database, and the next time I compiled, all of the data I had added since was gone!
After some digging, I came to the conclusion that the compiled version of the app was using the SDF file sent to the bin/Debug folder by the file's Content:Copy If Newer build action. This means that there could be as many as 4 different copies of the database to be worried about: project folder, debug folder, release folder, and the deployed copy on the end user's PC.
I would like to have a single copy of the database on my dev machine that is accessed by the debug and release compiled versions as well as the database explorer in VS, separate from the copy that ClickOnce installs on the end user's PC. I suppose I could change the connection string to an absolute path during development and hope I remember to change it back to a relative one before I publish for deployment.
Finally, I foresee the need to release updates for this application as well and am worried that such an update would erase the end user's data if improperly done. If possible, I would like to be able to only update the schema of the end user's database without touching the data itself whenever I release an update. If this is not possible that is acceptable and I'll just have to make sure I put every structure I can think of into the database before my first deploy.
In summary my questions are the following:
How do I share a single SQL Compact database between VS, debug, and release?
How do I handle the local database during application deployment and updates, ideally with the ability to update the database schema without erasing the data?
I have a similar application, and I keep the database file completely separate, because you may also need to ship updates without the user's database being overwritten. I have a process that checks the database schema before the EF connection takes place. So when my users install this application, it requires them to download the database file from my web server, and it puts the file in a specific location on their computer.
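A sketch of that idea, with made-up names and paths: on startup, make sure a per-user copy of the .sdf exists outside the application folder, then point the connection string at it. Application updates can then never clobber the user's data:

    using System;
    using System.IO;

    class DbLocator
    {
        // Returns a connection string pointing at a per-user copy of the
        // database, seeding it from the shipped template on first run.
        static string GetConnectionString()
        {
            string dataDir = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
                "MyApp"); // hypothetical application name
            Directory.CreateDirectory(dataDir);

            string dbPath = Path.Combine(dataDir, "MyApp.sdf");
            if (!File.Exists(dbPath))
            {
                // First run: copy the read-only template shipped next to the
                // .exe (or download it from a web server, as described above).
                string template = Path.Combine(
                    AppDomain.CurrentDomain.BaseDirectory, "MyApp.sdf");
                File.Copy(template, dbPath);
            }

            return "Data Source=" + dbPath + ";";
        }
    }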
I have a SQL Server database, and I need to export all of the data into an Access .mdb that users can download. What's the simplest way of doing this from C#?
I realise I could have a blank .mdb (but with the schema in place), copy it when I want to export, then read all the data from SQL Server into the .mdb via datasets, but that seems like a right faff. Is there an easier way?
Thanks
Could you create a DTS package to do the export, then write a C# app to execute the DTS package? The user would then just need to run your C# app. You could also set the DTS package to run automatically if you don't want user interaction.
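A sketch of the C# side, assuming an SSIS-era .dtsx run through the dtexec command-line utility (classic DTS packages use dtsrun instead); the package path here is hypothetical:

    using System.Diagnostics;

    class RunPackage
    {
        static void Main()
        {
            // Launch the package via dtexec and wait for it to finish;
            // dtexec returns 0 on success.
            var psi = new ProcessStartInfo
            {
                FileName = "dtexec",
                Arguments = "/F \"C:\\packages\\ExportToAccess.dtsx\"", // hypothetical path
                UseShellExecute = false
            };
            using (var proc = Process.Start(psi))
            {
                proc.WaitForExit();
                bool succeeded = proc.ExitCode == 0;
            }
        }
    }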