How to map elements in XML files to columns in a SQL table? - C#

I have multiple XML files whose data needs to be stored in one table in a database. All the files should be mapped to one table.
But the element names in the XML files are different from the column names in the table. How do I map the XML elements (from different files) to columns in the SQL table? How do I perform this mapping and store the data in the required columns? What is the optimal way of doing this task in C# when you have many XML files?
Note also that every XML file will have different element names and a different format.
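
One way to tackle this (a minimal sketch, not from the question itself): keep one dictionary per file format that maps XML element names to SQL column names, fill a DataTable through it, and bulk-insert. The Transactions table, the element names, and the flat one-record-per-child-of-root layout are all illustrative assumptions:

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using System.Xml.Linq;

    class XmlImporter
    {
        // Hypothetical mapping for one file format: XML element -> SQL column.
        // Each XML format gets its own dictionary, e.g. loaded from config.
        static readonly Dictionary<string, string> FileAMap = new Dictionary<string, string>
        {
            { "TrxnNo",   "TransactionNumber" },
            { "CustName", "CustomerName" }
        };

        static void Import(string xmlPath, Dictionary<string, string> map, string connStr)
        {
            var table = new DataTable("Transactions");      // assumed target table
            foreach (var col in map.Values)
                table.Columns.Add(col, typeof(string));     // assuming string columns

            // Assumes one record per child element of the root.
            foreach (var record in XDocument.Load(xmlPath).Root.Elements())
            {
                var row = table.NewRow();
                foreach (var pair in map)
                    row[pair.Value] = (object)record.Element(pair.Key)?.Value ?? DBNull.Value;
                table.Rows.Add(row);
            }

            using (var bulk = new SqlBulkCopy(connStr) { DestinationTableName = "Transactions" })
            {
                foreach (var col in map.Values)             // DataTable columns map 1:1 to SQL columns
                    bulk.ColumnMappings.Add(col, col);
                bulk.WriteToServer(table);
            }
        }
    }

Each new file format then only needs its own dictionary rather than new code.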

Related

SSIS package: how to import CSV lines selectively based on a query

I have multiple CSV files with 100 or so columns each that I want to import into a SQL database using an SSIS package. These CSV files are received every night, and I want our SQL tables to function as a history/track-changes table.
In other words, I'm required to evaluate each line of the CSV prior to import based on a unique identifier. I need to check the latest (based on date of import) entry for an ID in the table: if it exists and differs from the new line in the CSV, the line should be imported; if it's a duplicate, it should be ignored; if the ID doesn't exist at all, the line should also be imported.
I can't simply filter out all duplicates, as a change from X to Y and then back to X should be recorded in the table (as it's a history/change table).
Originally I tried to get this working with a flat file import -> Lookup tool -> DB destination, but it doesn't look as though I can modify the Lookup tool to use a specific query as opposed to just comparing the indicated columns against the DB to see if a row exists. Is there a way to achieve this using the provided SSIS tools? The only alternative I can see is creating a custom script task to evaluate each line in the CSV beforehand and write it to a temp table or a new CSV containing only the data that needs to be inserted. This might be a perfectly viable solution, but I'm afraid of performance issues.
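
For what it's worth, the per-line evaluation described above can be done in a small pre-processing step that writes a filtered CSV for the SSIS flat-file source to consume. A rough sketch, where the History table, its Id/RawLine/ImportedAt columns, the file names, and the ID-in-the-first-column layout are all assumptions:

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.IO;
    using System.Linq;

    class CsvPreFilter
    {
        static void Main()
        {
            const string connStr = "Server=.;Database=Tracking;Integrated Security=true"; // assumed

            // Latest stored line per ID, based on date of import.
            var latest = new Dictionary<string, string>();
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(
                @"SELECT h.Id, h.RawLine
                  FROM History h
                  JOIN (SELECT Id, MAX(ImportedAt) AS MaxAt FROM History GROUP BY Id) m
                    ON m.Id = h.Id AND m.MaxAt = h.ImportedAt", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        latest[reader.GetString(0)] = reader.GetString(1);
            }

            // Keep a line if its ID is new, or if it differs from the latest stored version.
            var toImport = File.ReadLines("input.csv")
                .Skip(1)                                   // header row
                .Where(line =>
                {
                    var id = line.Split(',')[0];           // assumes the ID is the first column
                    return !latest.TryGetValue(id, out var stored) || stored != line;
                });

            File.WriteAllLines("filtered.csv", toImport);  // feed this to the flat-file source
        }
    }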

Upload CSV file and map its columns to tables in the database

How can we upload a CSV file through the web (ASP.NET MVC, C#) and map the columns in the CSV to our tables in the database?
Example:
CSV File:
Username,Address,Roles,Date
How do we add all the values in the 'Username' column to the User table's Name column?
The values in the 'Address' column to the AddrDet table's Address column?
The values in the 'Roles' column to the RolesDet table's Roles column?
And how do we choose which CSV columns are added to the database (so not every column in the CSV will be taken)?
This is using ASP.NET MVC C#.
All I know is that when the CSV is uploaded, a DataTable is created specifically for the CSV and every column in the CSV is uploaded.
Thank You
I'm using MVC and EF database-first.
This question is being marked as a duplicate of 'Upload CSV file to SQL server'.
I don't feel (and don't think) that the question is related to, or covers exactly the same topic as, that one, so I'm answering it. I have myself marked the question as too broad, as there is too much to explain.
I will also add some links to the question; they are not there to fill out the answer, only to give the OP an idea of which questions/topics to look up himself.
Short explanation:
Usually when you want to import data (a CSV file) into a database, you already have the structure and schema of the data (and of the database). There are existing tables TableA and TableB, each with certain columns. If you want to dynamically create new columns or update the DB schema based on a CSV file, that is hard work (and normally doesn't happen).
A C#/ASP.NET application works in a way where you give it an input (a user's click, a data load, a task scheduler passing some time checkpoint) and the app does the work.
A typical job looks like: "We get data in this format; the app has to convert it to the internal representation (classes) and then insert it into the server." So you have to write an ASP.NET page where you allow the user to paste/upload the file. E.g.: File Upload ASP.NET MVC 3.0
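
A sketch of that upload step (controller, action, and view names are illustrative, not from the linked example):

    using System.IO;
    using System.Web;
    using System.Web.Mvc;

    public class ImportController : Controller
    {
        [HttpPost]
        public ActionResult Upload(HttpPostedFileBase file)
        {
            if (file == null || file.ContentLength == 0)
                return new HttpStatusCodeResult(400, "No file uploaded");

            string csvText;
            using (var reader = new StreamReader(file.InputStream))
                csvText = reader.ReadToEnd();

            TempData["csv"] = csvText;           // hand the raw text to the next step
            return RedirectToAction("Preview");  // hypothetical follow-up action
        }
    }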
Once you have loaded the file, you need to convert the CSV format (the format the data is stored in) into your internal representation. That means creating your own class with some properties and converting (transforming) the CSV into instances of that class. E.g.: Importing CSV data into C# classes
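
A minimal transform might look like this (a sketch: no quoted-field or escaping support, and the column order is assumed from the question's header line):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class UserRow
    {
        public string Username { get; set; }
        public string Address  { get; set; }
        public string Roles    { get; set; }
    }

    public static class CsvTransform
    {
        public static List<UserRow> Parse(string csvText)
        {
            return csvText
                .Split(new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries)
                .Skip(1)                                // skip the Username,Address,Roles,Date header
                .Select(line => line.Split(','))
                .Select(f => new UserRow { Username = f[0], Address = f[1], Roles = f[2] })
                .ToList();
        }
    }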
Once you have this data inside classes (objects, i.e. instances of classes), you can work with it and carry out the internal work. Here we are looking at CRUD (Create/Read/Update/Delete) operations against a SQL database. First you need to connect to the SQL server, choose a database, and then run the queries. E.g.: https://www.codeproject.com/Articles/837599/Using-Csharp-to-connect-to-and-query-from-a-SQL-da
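
A plain ADO.NET version of one such insert, reusing the UserRow class from above (the User table and Name column come from the question; the rest is assumed):

    using System.Data.SqlClient;

    static void InsertUser(string connectionString, UserRow row)
    {
        // Parameterized insert into the User table's Name column.
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("INSERT INTO [User] (Name) VALUES (@name)", conn))
        {
            cmd.Parameters.AddWithValue("@name", row.Username);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }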
Plenty of developers are too lazy to write the queries themselves and prefer a more object-oriented approach to this sort of problem. They use an ORM (object-relational mapping), which allows them to have the same class/object schema in the database and in the application. One example for all is Entity Framework (EF). E.g.: http://www.entityframeworktutorial.net/
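
The same insert through EF might look like this (a sketch; AppContext, Users, and User are placeholders for whatever your database-first model generates):

    using System.Collections.Generic;

    static void SaveUsers(IEnumerable<UserRow> rows)
    {
        using (var db = new AppContext())          // hypothetical generated context
        {
            foreach (var row in rows)
                db.Users.Add(new User { Name = row.Username });
            db.SaveChanges();                      // one call commits all the inserts
        }
    }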
As you can see, this topic is not so easy and requires knowledge of several areas of programming.

Import DBF or Excel file into SQL Server with dynamic column mapping using ASP.NET

Recently, I've been importing DBF files of approximately 50 to 250 million records, with differing column structures, into a database table.
Suppose I have a database table "RawUpload" that consists of 68 columns for saving the uploaded details. This table structure is the same for dBase/Excel files with unmapped columns. Consider a case where dBase file A1 has a column named 'TrxnNo' while another file A2 has a column named 'TrxnNum' or 'TradeNo', either of which maps to the table column TransactionNumber nvarchar(255). Similarly, there are approximately 10 to 30 columns in the files that can change frequently. It's hard to track them, so I wanted to go with dynamic column mapping against the database table at runtime.
To accomplish this task I've tried the following:
Using an SSIS package, but I was not able to handle the dynamic column structure.
Using SqlBulkCopy, which takes more time to complete and even results in timeouts and the application hanging.
Using a DTS (Interop.DTS) package, which works fine but fails for huge record counts.
Now my question is: what is the fastest way to import such big files into a SQL Server table with dynamic column mapping, in a way that works for large data?
Thanks
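
As an aside (not from the thread): SqlBulkCopy's default 30-second BulkCopyTimeout and single-transaction load are common causes of the timeout/hang symptoms described above. A sketch of the usual mitigations, with the RawUpload table name taken from the question and the mapping pairs assumed to be resolved at runtime:

    using System.Data;
    using System.Data.SqlClient;

    static void BulkLoad(DataTable table, string connStr, (string src, string dest)[] columnMap)
    {
        using (var bulk = new SqlBulkCopy(connStr, SqlBulkCopyOptions.TableLock)) // table lock speeds up large loads
        {
            bulk.DestinationTableName = "RawUpload";
            bulk.BulkCopyTimeout = 0;               // 0 = no timeout (the default is 30 seconds)
            bulk.BatchSize = 10000;                 // commit in chunks instead of one huge transaction
            foreach (var (src, dest) in columnMap)  // e.g. ("TrxnNo", "TransactionNumber"), built at runtime
                bulk.ColumnMappings.Add(src, dest);
            bulk.WriteToServer(table);
        }
    }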

Clarification on logic when creating 100s of tables using SQLite

Using WinForms, C#, SQLite.
My current app takes in data from text files and stores it in three respective tables. It then uses these tables to produce a variety of output based on the user's selection.
Currently this app only deals with one text file, but I need to make it process hundreds of text files, i.e., read each text file's data, store it in tables, etc.
... Then I will have 3 tables multiplied by the hundreds of text files (3 tables for each file).
1) Is it possible to maintain this many tables in SQLite?
2) How do I ensure my tables don't just get overwritten by the next file's values? Can someone post sample code for how they would approach this?
SQLite has no limit on the number of tables.
Each table must have a unique name.
However, it would be a better idea to normalize your database, i.e., use a single table with an additional column that specifies the original file of the record.
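
A sketch of that normalization, assuming the System.Data.SQLite provider, a hypothetical Readings table, and three text columns per row:

    using System.Collections.Generic;
    using System.Data.SQLite;

    static void StoreFile(SQLiteConnection conn, string fileName, IEnumerable<string[]> rows)
    {
        // One shared table; SourceFile distinguishes rows from different text files,
        // so each new file appends rows instead of overwriting anything.
        using (var create = new SQLiteCommand(
            @"CREATE TABLE IF NOT EXISTS Readings
              (SourceFile TEXT, Col1 TEXT, Col2 TEXT, Col3 TEXT)", conn))
            create.ExecuteNonQuery();

        using (var insert = new SQLiteCommand(
            "INSERT INTO Readings (SourceFile, Col1, Col2, Col3) VALUES (@f, @a, @b, @c)", conn))
        {
            foreach (var r in rows)
            {
                insert.Parameters.Clear();
                insert.Parameters.AddWithValue("@f", fileName);
                insert.Parameters.AddWithValue("@a", r[0]);
                insert.Parameters.AddWithValue("@b", r[1]);
                insert.Parameters.AddWithValue("@c", r[2]);
                insert.ExecuteNonQuery();
            }
        }
    }

Queries for one file's data then just filter on SourceFile instead of picking one of hundreds of tables.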

Combine Data from multiple DataSets

I'm loading data from multiple XML files with different schemas into DataSets. I do have foreign-key-style relationships between the tables in each XML file, but to date they're only enforced by code. I need to access data coming from multiple files and display it in a DataGridView.
Is there a way to merge the data from multiple files into a single DataSet?
Alternatively, can I write LINQ to DataSet queries across multiple DataSets?
Perhaps the DataSet.Merge() method will help you out? You can simply load the files as you're currently doing, and merge them together.
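
A minimal sketch of that, assuming a WinForms form with a dataGridView1 and hypothetical file names:

    using System.Data;

    void LoadAll(string[] xmlPaths)        // e.g. { "orders.xml", "customers.xml" }
    {
        var combined = new DataSet();
        foreach (var path in xmlPaths)
        {
            var ds = new DataSet();
            ds.ReadXml(path);              // infers a schema when none is embedded
            combined.Merge(ds);            // same-named tables merge by primary key; others are added
        }
        dataGridView1.DataSource = combined.Tables[0]; // bind one merged table to the grid
    }

Note that Merge only combines rows of tables that share a name and primary key; tables unique to one file are simply added alongside.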
