Saving via Entity Framework takes a lot of resources - C#

I'm using a SQL Server database connected to my ASP.NET MVC application via Entity Framework.
The application is deployed on a virtual machine managed by a cluster. Currently the max memory is set to 32 GB; I could raise it to 64 GB.
One of the features available to users is loading CSV files into the database. Each file represents one day and consists of 80-90k records with 200 columns. For this I used the following setup:
public ActionResult AddRecordsFromCSVFile2(FileInput fileInputs)
{
    BWSERWISEntities bWDiagnosticEntities2 = new BWSERWISEntities();
    bWDiagnosticEntities2.Configuration.AutoDetectChangesEnabled = true;
    bWDiagnosticEntities2.Configuration.ValidateOnSaveEnabled = false;
    var Files = fileInputs.File;
    if (ModelValidate())
    {
        foreach (var fileInput in Files)
        {
            List<VOLTER_LOGS> LogsToAdd = new List<VOLTER_LOGS>();
            using (MemoryStream memoryStream = new MemoryStream())
            {
                fileInput.InputStream.CopyTo(memoryStream);
                memoryStream.Seek(0, SeekOrigin.Begin);
                using (var streamReader = new StreamReader(memoryStream))
                {
                    var csvConfig = new CsvConfiguration(CultureInfo.InvariantCulture)
                    {
                        Delimiter = ";",
                        HasHeaderRecord = false
                    };
                    using (var csvReader = new CsvReader(streamReader, csvConfig))
                    {
                        var records = csvReader.GetRecords<ReadAllRecords>().ToList();
                        var firstRecord = records.Where(x => x.Parametr4 != "0" && x.Parametr5 != "0" && x.Parametr6 != "0").FirstOrDefault();
                        var StartDateToCheck = Convert.ToDateTime(firstRecord.Parametr4 + "-" + firstRecord.Parametr5 + "-" + firstRecord.Parametr6 + " " + firstRecord.Parametr3 + ":" + firstRecord.Parametr2 + ":" + firstRecord.Parametr1, new CultureInfo("en-GB", true));
                        var lastRecord = records.Where(x => x.Parametr4 != "0" && x.Parametr5 != "0" && x.Parametr6 != "0").LastOrDefault();
                        var EndDateToCheck = Convert.ToDateTime(lastRecord.Parametr4 + "-" + lastRecord.Parametr5 + "-" + lastRecord.Parametr6 + " " + lastRecord.Parametr3 + ":" + lastRecord.Parametr2 + ":" + lastRecord.Parametr1, new CultureInfo("en-GB", true));
                        int matchingMachineID = int.Parse(firstRecord.Parametr7);
                        // delete any rows already loaded for this machine and date range
                        var matchingElements = bWDiagnosticEntities2.VOLTER_LOGS.Where(x => (x.Parametr7 == matchingMachineID) && (x.Parametr1 >= StartDateToCheck && x.Parametr1 <= EndDateToCheck)).ToList();
                        bWDiagnosticEntities2.VOLTER_LOGS.RemoveRange(matchingElements);
                        bWDiagnosticEntities2.SaveChanges();
                        foreach (var record in records)
                        {
                            if (record.Parametr4 != "0" && record.Parametr5 != "0" && record.Parametr6 != "0")
                            {
                                bWDiagnosticEntities2.Configuration.AutoDetectChangesEnabled = false;
                                string date = record.Parametr4 + "-" + record.Parametr5 + "-" + record.Parametr6 + " " + record.Parametr3 + ":" + record.Parametr2 + ":" + record.Parametr1;
                                DateTime recordDate = DateTime.Parse(date, new CultureInfo("en-GB", true));
                                // DateTime recordDate = DateTime.ParseExact(date, "dd-MM-yyyy HH:mm:ss", new CultureInfo("en-GB"));
                                VOLTER_LOGS vOLTER_LOGS = new VOLTER_LOGS();
                                vOLTER_LOGS.Parametr1 = recordDate;
                                vOLTER_LOGS.Parametr2 = 0;
                                vOLTER_LOGS.Parametr3 = 0;
                                vOLTER_LOGS.Parametr4 = 0;
                                vOLTER_LOGS.Parametr5 = 0;
                                vOLTER_LOGS.Parametr6 = 0;
                                vOLTER_LOGS.Parametr7 = int.Parse(record.Parametr7);
                                // ... Parametr8 through Parametr199 assigned the same way ...
                                vOLTER_LOGS.Parametr200 = int.Parse(record.Parametr200);
                                LogsToAdd.Add(vOLTER_LOGS);
                            }
                        }
                    }
                }
                bWDiagnosticEntities2.VOLTER_LOGS.AddRange(LogsToAdd);
                bWDiagnosticEntities2.SaveChanges();
            }
        }
    }
    return View();
}
This implementation takes 5-8 minutes and uses 45-55% of the VM's memory, but during the save process other simple requests (e.g. get the first and last date) sometimes fail to get a response, with the error:
The request limit has expired or the server is not responding.
The server has a lot of free memory, but the table already holds about 7 million records and is still growing.
In the end, files probably won't be saved while other users' requests are running.
Now I'm wondering whether it's a good idea to use Entity Framework for that data volume.
The main goal of the application is to detect errors in the data over a period of time.
Does anyone happen to have experience with processing this amount of data via Entity Framework?
Is there any way to split resources between the tasks, so that every request can get a response?

EF is an ORM, not an import tool. An ORM's job is to load graphs of related entities and give the impression of working with objects instead of tables and SQL. There are no entities or graphs in import jobs, though; there are files, fields, tables and mappings.
The fastest way to import data into SQL Server tables in .NET is to use the SqlBulkCopy class. This uses the same mechanism as the bcp tool or the BULK INSERT command to import data as a stream, using minimal logging. The input can be a DataTable or an IDataReader instance.
Libraries like CsvHelper or ExcelDataReader provide IDataReader-derived classes out of the box. In other cases, FastMember's ObjectReader can be used to create an IDataReader wrapper over any IEnumerable.
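For example, a minimal sketch of the ObjectReader route (the Log class, its property names and the dbo.Logs table are hypothetical stand-ins, not taken from the question):
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading.Tasks;
using FastMember;

public static async Task BulkInsertAsync(IEnumerable<Log> logs, string connectionString)
{
    // ObjectReader exposes the IEnumerable as an IDataReader; the member
    // names listed here must match properties on the (hypothetical) Log class
    using var reader = ObjectReader.Create(logs, "Date", "MachineId", "Value");
    using var bcp = new SqlBulkCopy(connectionString);
    bcp.DestinationTableName = "dbo.Logs";
    await bcp.WriteToServerAsync(reader);
}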
Loading a CSV into a table with CsvHelper and SqlBulkCopy could be as simple as this, provided the file and table columns match:
public async Task SimpleImport(string path, string table,
    string connectionString)
{
    using var reader = new StreamReader(path);
    var csvConfig = new CsvConfiguration(CultureInfo.InvariantCulture)
    {
        Delimiter = ";"
    };
    using var csv = new CsvReader(reader, csvConfig);
    using var dr = new CsvDataReader(csv);
    using var bcp = new SqlBulkCopy(connectionString);
    bcp.DestinationTableName = table;
    await bcp.WriteToServerAsync(dr);
}
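SqlBulkCopy also exposes a few settings worth tuning for large files. A hedged variation of the method above (the values are illustrative; note that TableLock makes the load faster but blocks concurrent readers, which matters if other requests must keep getting responses):
using var bcp = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock);
bcp.DestinationTableName = table;
bcp.EnableStreaming = true; // stream rows from the IDataReader instead of buffering them
bcp.BatchSize = 10000;      // commit every 10k rows instead of one huge batch
bcp.BulkCopyTimeout = 0;    // disable the timeout for long-running imports
await bcp.WriteToServerAsync(dr);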
If there are no headers, they'll have to be provided through CsvHelper's configuration, e.g. through GetDynamicPropertyName or a ClassMap:
string[] fields = new[] { ..... };
csvConfig.GetDynamicPropertyName = args =>
{
    if (args.FieldIndex < 10)
    {
        return fields[args.FieldIndex];
    }
    else
    {
        return $"Value_{args.FieldIndex}";
    }
};
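If the generated names (or the CSV's own headers) don't line up with the destination table by position, SqlBulkCopy's ColumnMappings can map source to destination columns by name. A short sketch with hypothetical names on both sides:
bcp.ColumnMappings.Add("Value_10", "Parametr10"); // source name -> table column
bcp.ColumnMappings.Add("MachineId", "Parametr7");
Note that once any mapping is added, only explicitly mapped columns are copied.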
Using a ClassMap allows specifying type conversions, including complex ones:
public class Foo
{
    ...
    public int Year { get; set; }
    public int Month { get; set; }
    public int Day { get; set; }
    public DateTime Date { get; set; }
}

public class FooMap : ClassMap<Foo>
{
    public FooMap()
    {
        AutoMap(CultureInfo.InvariantCulture);
        Map(m => m.Date).Convert(row => DateFromParts(row.GetField(5), row.GetField(6), row.GetField(7)));
    }

    DateTime DateFromParts(string year, string month, string day)
    {
        return DateTime.Parse($"{year}-{month}-{day}");
    }
}
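To use the map, register it before reading. A minimal sketch; RegisterClassMap hangs off csv.Context in recent CsvHelper versions (csv.Configuration in older ones):
using var csv = new CsvReader(reader, csvConfig);
csv.Context.RegisterClassMap<FooMap>();
IEnumerable<Foo> records = csv.GetRecords<Foo>(); // lazy, so rows are streamed rather than loaded up front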

Related

I want to optimize the upload time of Excel file records saving to the database in ASP.NET MVC

I want to upload an Excel file of 2500 records. The process currently takes more than 5 minutes, and I want to optimize it to under a minute.
Please help me find the best way to optimize this code. I am using DbGeography for storing locations. The file uploads and the data saves properly; everything works fine, but I need optimization.
public ActionResult Upload(HttpPostedFileBase FileUpload)
{
    if (FileUpload.ContentLength > 0)
    {
        string fileName = Path.GetFileName(FileUpload.FileName);
        string ext = fileName.Substring(fileName.LastIndexOf('.'));
        Random r = new Random();
        int ran = r.Next(1, 13);
        string path = Path.Combine(Server.MapPath("~/App_Data/uploads"), DateTime.Now.Ticks + "_" + ran + ext);
        try
        {
            FileUpload.SaveAs(path);
            ProcessCSV(path);
            ViewData["Feedback"] = "Upload Complete";
        }
        catch (Exception ex)
        {
            ViewData["Feedback"] = ex.Message;
        }
    }
    return View("Upload", ViewData["Feedback"]);
}
Here is the other method, which reads the uploaded file and saves it to the database:
private void ProcessCSV(string filename)
{
    Regex r = new Regex(",(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))");
    StreamReader sr = new StreamReader(filename);
    string line = null;
    string[] strArray;
    int iteration = 0;
    while ((line = sr.ReadLine()) != null)
    {
        if (iteration == 0)
        {
            iteration++;
            continue; // because it's the heading
        }
        strArray = r.Split(line);
        StoreLocation store = new StoreLocation();
        store.Name = strArray[0];
        store.StoreName = strArray[1];
        store.Street = strArray[2];
        store.Street2 = strArray[3];
        store.St_City = strArray[4];
        store.St_St = strArray[5];
        store.St_Zip = strArray[6];
        store.St_Phone = strArray[7];
        store.Office_Phone = strArray[8];
        store.Fax = strArray[9];
        store.Ownership = strArray[10];
        store.website = strArray[11];
        store.Retailer = check(strArray[12]);
        store.WarehouseDistributor = check(strArray[13]);
        store.OnlineRetailer = check(strArray[14]);
        string temp_Add = store.Street + "," + store.St_City;
        try
        {
            var point = new GoogleLocationService().GetLatLongFromAddress(temp_Add);
            string points = string.Format("POINT({0} {1})", point.Latitude, point.Longitude);
            store.Location = DbGeography.FromText(points); // lat/long
        }
        catch (Exception e)
        {
            continue;
        }
        db.StoreLocations.Add(store);
        db.SaveChanges();
    }
}
The obvious place to start would be your external call to the geolocation service, which you are making on every iteration of the loop. I don't know that service, but is there any way you can build up a list of addresses in memory, hit it once with multiple addresses, and then go back and amend all the records you need to update?
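Short of a true batch endpoint (and hedging on what GoogleLocationService exposes beyond the per-address call shown), even a simple cache avoids repeat lookups when the same address appears on several lines. A sketch reusing only the calls from the question:
private readonly Dictionary<string, DbGeography> geoCache = new Dictionary<string, DbGeography>();

private DbGeography GeocodeCached(string address)
{
    DbGeography location;
    // only call the external service the first time we see an address
    if (!geoCache.TryGetValue(address, out location))
    {
        var point = new GoogleLocationService().GetLatLongFromAddress(address);
        location = DbGeography.FromText(string.Format("POINT({0} {1})", point.Latitude, point.Longitude));
        geoCache[address] = location;
    }
    return location;
}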
The call to db.SaveChanges(), which updates the database, is happening for every line:
while (...)
{
    ...
    db.StoreLocations.Add(store);
    db.SaveChanges(); // many database updates
}
Instead, just call it once at the end:
while (...)
{
    ...
    db.StoreLocations.Add(store);
}
db.SaveChanges(); // one combined database update
This should speed up your code nicely.
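If the files ever grow large enough that tracking every entity until the end becomes a memory concern, a middle ground is to save in batches. A sketch, where the batch size of 1000 is arbitrary and stores stands in for the parsed rows:
const int batchSize = 1000; // arbitrary; tune for your data
int pending = 0;
foreach (var store in stores)
{
    db.StoreLocations.Add(store);
    if (++pending == batchSize)
    {
        db.SaveChanges(); // one round trip per batch
        pending = 0;
    }
}
if (pending > 0)
    db.SaveChanges(); // flush the remainder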

Import data to enterprise custom field/lookup table

I have a problem: I want to import data from SQL/Excel into a lookup table and let an enterprise custom field look it up.
I'm using Microsoft Project 2016 Professional/Online.
I've used C# samples to read lookup tables and their entries, but I'm not able to add to the tables. Does anyone have suggestions or samples for this kind of problem?
SvcLookupTable.LookupTableDataSet lutDs = new SvcLookupTable.LookupTableDataSet();
LookupEntryCreationInformation creationInfo = new LookupEntryCreationInformation();
LookupEntryValue entryValue = new LookupEntryValue();
Guid entryTableGuid = new Guid("417552b5-7877-e711-80ce-00155d40cd19");
Console.WriteLine("---------------Lookup Tables---------------");
LookupTableCollection lookupTables = projContext.LookupTables;
projContext.Load(lookupTables);
projContext.ExecuteQuery();
entryValue.TextValue = "ITS";
creationInfo.Value = entryValue;
creationInfo.Id = entryTableGuid;
LookupTableCollection lookupTablesColl = projContext.LookupTables;
projContext.Load(lookupTables);
projContext.ExecuteQuery();
int numTables = projContext.LookupTables.Count;
LookupTable lookTable = lookupTables[numTables - 1];
Guid rowGuid = new Guid();
foreach (Microsoft.ProjectServer.Client.LookupTable lt in lookupTables)
{
    Console.WriteLine(lt.Name + " {" + lt.Id + "}");
    ImportLookupTable importTable = new ImportLookupTable();
    importTable.AddLookupTableValues(lutDs, rowGuid, rowGuid, "ITS", "Skyr");
    projContext.Load(lt.Entries);
    projContext.ExecuteQuery();
    foreach (LookupEntry entry in lt.Entries)
    {
        Console.WriteLine("    " + entry.FullValue + " {" + entry.Id + "}");
    }
}

Change the name of headers in CSV file using CSVHelper in C#

I am using the CSV Helper library to produce CSV files for the user to populate and upload into the system. My issue is that the WriteHeader method just writes the attributes of a class with names like "PropertyValue", which is not user friendly. Is there a method I can use to make the produced text user friendly while still being able to successfully map the class to the file's data?
My code looks like the following:
public ActionResult UploadPropertyCSV(HttpPostedFileBase file)
{
    List<PropertyModel> properties = new List<PropertyModel>();
    RIMEDb dbContext = new RIMEDb();
    bool success = false;
    foreach (string requestFiles in Request.Files)
    {
        if (file != null && file.ContentLength > 0 && file.FileName.EndsWith(".csv"))
        {
            using (StreamReader str = new StreamReader(file.InputStream))
            {
                using (CsvHelper.CsvReader theReader = new CsvHelper.CsvReader(str))
                {
                    while (theReader.Read())
                    {
                        RIMUtil.PropertyUploadCSVRowHelper row = new RIMUtil.PropertyUploadCSVRowHelper()
                        {
                            UnitNumber = theReader.GetField(0),
                            StreetNumber = theReader.GetField(1),
                            StreetName = theReader.GetField(2),
                            AlternateAddress = theReader.GetField(3),
                            City = theReader.GetField(4)
                        };
                        Property property = new Property();
                        property.UnitNumber = row.UnitNumber;
                        property.StreetNumber = row.StreetNumber;
                        property.StreetName = row.StreetName;
                        property.AlternateAddress = row.AlternateAddress;
                        property.City = dbContext.PostalCodes.Where(p => p.PostalCode1 == row.PostalCode).FirstOrDefault().City;
                        dbContext.Properties.Add(property);
                        try
                        {
                            dbContext.SaveChanges();
                            success = true;
                        }
                        catch (System.Data.Entity.Validation.DbEntityValidationException ex)
                        {
                            success = false;
                            RIMUtil.LogError("Problem validating fields in database. Please check your CSV file for errors.");
                        }
                        catch (Exception e)
                        {
                            RIMUtil.LogError("Error saving property to database. Please check your CSV file for errors.");
                        }
                    }
                }
            }
        }
    }
    return Json(success);
}
I'm wondering if there's some metadata tag or something I can put on top of each attribute in my PropertyUploadCSVRowHelper class to produce the text I want in the file.
Thanks in advance.
Not sure if this existed 2 years ago, but now we can change the property/column name by using the following attribute:
[CsvHelper.Configuration.Attributes.Name("Column/Field Name")]
Full code:
using CsvHelper;
using System.Collections.Generic;
using System.Globalization;
using System.IO;

namespace Test
{
    class Program
    {
        class CsvColumns
        {
            private string column_01;

            [CsvHelper.Configuration.Attributes.Name("Column 01")] // changes header/column name Column_01 to Column 01
            public string Column_01 { get => column_01; set => column_01 = value; }
        }

        static void Main(string[] args)
        {
            List<CsvColumns> csvOutput = new List<CsvColumns>();
            CsvColumns rows = new CsvColumns();
            rows.Column_01 = "data1";
            csvOutput.Add(rows);
            string filename = "test.csv";
            using (StreamWriter writer = File.CreateText(filename))
            using (CsvWriter csv = new CsvWriter(writer, CultureInfo.InvariantCulture)) // newer CsvHelper versions require a culture here
            {
                csv.WriteRecords(csvOutput);
            }
        }
    }
}
This might not be answering your question directly, as you said you wanted to use CsvHelper, but if you're only writing small files, here is a simple function I use to generate CSV. Note that CsvHelper will be much better for larger files, as this just builds a string rather than streaming the data.
Just customise the columns array in the code below to suit your needs.
public string GetCsv(string[] columns, List<object[]> data)
{
    StringBuilder CsvData = new StringBuilder();
    // add column headers
    string[] s = new string[columns.Length];
    for (Int32 j = 0; j < columns.Length; j++)
    {
        s[j] = columns[j];
        if (s[j].Contains("\"")) // escape " as ""
            s[j] = s[j].Replace("\"", "\"\"");
        if (s[j].Contains("\"") || s[j].Contains(" ") || s[j].Contains(",")) // quote any field containing a quote, space or comma
            s[j] = "\"" + s[j] + "\"";
    }
    CsvData.AppendLine(string.Join(",", s));
    // add rows
    foreach (var row in data)
    {
        for (int j = 0; j < columns.Length; j++)
        {
            s[j] = row[j] == null ? "" : row[j].ToString();
            if (s[j].Contains("\"")) // escape " as ""
                s[j] = s[j].Replace("\"", "\"\"");
            if (s[j].Contains("\"") || s[j].Contains(" ") || s[j].Contains(",")) // quote any field containing a quote, space or comma
                s[j] = "\"" + s[j] + "\"";
        }
        CsvData.AppendLine(string.Join(",", s));
    }
    return CsvData.ToString();
}
Here is a fiddle example of how to use it: https://dotnetfiddle.net/2WHf6o
Good luck.

sort data from csv to txt

I am writing an application which sorts data from a CSV file into a txt file.
I have written it, but I cannot get the required output.
Can someone please help? I don't see where I went wrong.
I initially thought File.WriteAllLines was the problem, but even when I write to the console I get the same results.
My file looks something like this:
Georgina,Sinclair,408999703657,cheque,First National Bank,Fourways,275.00,12/01/2012
Zachary,Whitehead,409122372301,cheque,ABSA,Irene,70.25,12/01/2012
Toby,Henderson,401255489873,cheque,First National Bank,Edenvale,181.03,12/13/2012
Katherine,Cooke,409155874935,savings,ABSA,Southdowns,975.89,01/01/2013
Bradley,James,409254998,savings,ABSA,Melville,207.74,12/09/2012
Sophie,Lane,409771987,savings,ABSA,Roodepoort,207.74,12/31/2012
My output should be something like this:
First National B0020000045603
GSinclair 408999703657 CH Fourways 002750001122012
THenderson 401255489873 CH Edenvale 001810313122012
ABSA 0040000146162
ZWhitehead 409122372301 CH Irene 000702501122012
KCooke 409155874935 SAVSouthdowns009758901012013
BJames 409254998 SAVMelville 002077409122012
SLane 409771987 SAVRoodepoort002077431122012
The code I currently have only returns the header and 2 lines, which look as follows:
ABSA 0040000146162
KCooke 409155874935 SAVSouthdowns 009758901012013
Please assist. My code looks as follows:
string text = @"C:\Test\output.txt";
var inputEntries = File.ReadLines(@"C:\Test\debitorders.csv").Select(line =>
{
    var values = line.Split(',');
    return new
    {
        accountholder = values[0].Trim().Substring(0, 1) + values[1].Trim(),
        accountnumber = long.Parse(values[2].Trim()),
        accounttype = values[3].Trim(),
        bankname = values[4].Trim(),
        branch = values[5].Trim(),
        amount = 100 * double.Parse(values[6].Trim()),
        date = DateTime.Parse(values[7].Trim())
    };
});
var banks = inputEntries
    .OrderBy(e => e.bankname)
    .GroupBy(e => e.bankname, e => e);
foreach (var bank in banks)
{
    var AccountName = bank.Key;
    if (AccountName.Length >= 20)
    {
        AccountName = AccountName.Substring(0, 16);
    }
    else
    {
        AccountName += new string(' ', 20 - AccountName.Length);
    }
    var NumberOfAccounts = bank.Count();
    var TotalAmount = bank.Select(acc => acc.amount).Sum();
    var Header = AccountName + "\t" + NumberOfAccounts.ToString("000") + TotalAmount.ToString("0000000000");
    var sortedAccounts = bank
        .OrderBy(acc => acc.accountholder)
        .OrderByDescending(acc => acc.amount);
    foreach (var account in sortedAccounts)
    {
        var outputLine =
            account.accountholder + "\t" +
            account.accountnumber + "\t" +
            // get first 3 characters of the account type
            account.accounttype.Substring(0, 3).ToUpper() + account.branch + "\t" + "00" +
            account.amount +
            account.date.ToString("ddMMyyyy");
        for (int i = 0; i < 15; i++)
        {
            File.WriteAllText(text, Header + Environment.NewLine + outputLine);
            Console.WriteLine(Header + outputLine);
            Console.ReadLine();
        }
    }
}
A better and cleaner solution is to make use of a List<string> to which you add your text. At the end of the code, just write the whole list to the file in one go.
List<string> outputLine = new List<string>(); // note this addition to the code
foreach (var bank in banks)
{
    // do header formatting stuff here
    var Header = somecode;
    outputLine.Add(Header); // add Header to outputLine
    var sortedAccounts = bank.OrderBy(acc => acc.accountholder)
                             .OrderByDescending(acc => acc.amount);
    foreach (var account in sortedAccounts)
    {
        var tempStringBuilder =
            account.accountholder + "\t" +
            account.accountnumber + "\t" +
            // get first 3 characters of the account type
            account.accounttype.Substring(0, 3).ToUpper() + account.branch + "\t" + "00" +
            account.amount +
            account.date.ToString("ddMMyyyy");
        outputLine.Add(tempStringBuilder); // add tempStringBuilder to outputLine
    }
}
File.WriteAllLines("destination path", outputLine.ToArray()); // write everything to your output file in one go
Alternative Excel solution:
Microsoft Excel has a really powerful tool called Pivot Tables, which is ideally suited to your needs. If you are unfamiliar with them, read some tutorials. It takes a while to get your head around the workflow, but it is quite simple once you've grasped it: you just drag and drop the fields by which you want to group.
You might also want to consider using Data Connections to link to your original data, which is also quite simple given the dataset you have.
I think I found the solution:
File.AppendAllText(text, Header + Environment.NewLine + outputLine + Environment.NewLine);
Use File.AppendAllText instead of File.WriteAllText; with WriteAllText you always overwrote the old content.
But consider cleaning the file (File.WriteAllText(text, "");) before you begin to write to it, otherwise you will also have the old data from the last run in it.
Try using String.Format("{0,-10}", name), which pads the name with spaces up to a length of 10. A negative width means left alignment, a positive one right alignment.
I updated your code to:
string text = @"D:\C#\output.txt";
File.WriteAllText(text, "");
var inputEntries = File.ReadLines(@"D:\c#\debitorders.csv").Select(line =>
{
    var values = line.Split(',');
    return new
    {
        accountholder = values[0].Trim().Substring(0, 1) + values[1].Trim(),
        accountnumber = long.Parse(values[2].Trim()),
        accounttype = values[3].Trim(),
        bankname = values[4].Trim(),
        branch = values[5].Trim(),
        amount = 100 * double.Parse(values[6].Trim()),
        date = DateTime.ParseExact(values[7].Trim(), "MM/dd/yyyy", CultureInfo.InvariantCulture)
    };
});
var banks = inputEntries.OrderBy(e => e.bankname)
                        .GroupBy(e => e.bankname, e => e);
foreach (var bank in banks)
{
    var AccountName = bank.Key;
    var NumberOfAccounts = bank.Count();
    var TotalAmount = bank.Select(acc => acc.amount).Sum();
    var Header = String.Format("{0,-20} {1,-10} {2}", AccountName, NumberOfAccounts.ToString("000"), TotalAmount.ToString("0000000000"));
    var sortedAccounts = bank.OrderBy(acc => acc.accountholder)
                             .OrderByDescending(acc => acc.amount);
    File.AppendAllText(text, Header + Environment.NewLine);
    Console.WriteLine(Header);
    foreach (var account in sortedAccounts)
    {
        var outputLine = String.Format("{0,-11} {1,15} {2,-3} {3,-10} {4,7} {5,-10}",
            account.accountholder,
            account.accountnumber,
            account.accounttype.Substring(0, 3).ToUpper(),
            account.branch,
            account.amount,
            account.date.ToString("ddMMyyyy")
        );
        // original: account.accounttype.Substring(0, 3).ToUpper() + account.branch + "\t" + "00" + ...
        // what are the "00" for? They aren't included here; add them yourself if needed.
        File.AppendAllText(text, outputLine + Environment.NewLine);
        Console.WriteLine(outputLine);
    }
    File.AppendAllText(text, Environment.NewLine);
    Console.WriteLine();
}
Output is:
ABSA 004 0000146162
KCooke 409155874935 SAV Southdowns 97589 01012013
BJames 409254998 SAV Melville 20774 09122012
SLane 409771987 SAV Roodepoort 20774 31122012
ZWhitehead 409122372301 CHE Irene 7025 01122012
First National Bank 002 0000045603
GSinclair 408999703657 CHE Fourways 27500 01122012
THenderson 401255489873 CHE Edenvale 18103 13122012

Get the field list of a flat file connection?

How do I write a function (an external function in C#, F#, a PowerShell script, etc.)
List<string> GetFields(string ssisPackageName, string fileSourceName);
to get the field list of an SSIS package? Since the package is an XML file, can XQuery be used to get the list?
Or even better, get more information:
class Field
{
    public string Name { get; set; }
    public string Type { get; set; }
}

List<Field> GetFields(string ssisPackageName, string fileSourceName);
@billinkc is right: you should keep data typing issues in mind. That said, you could at best retrieve the Code Page and Unicode values for the Flat File Connection Manager itself. The following code should get you started; you might need some lookups for the code page and data type attributes.
string path = @"MyPathTo\Package.dtsx";
XNamespace dts = "www.microsoft.com/SqlServer/Dts";
XDocument doc = XDocument.Load(path);
// get all connections
var connections = from ele in doc.Descendants(dts + "ConnectionManager")
                  where ele.Attributes(dts + "ObjectName").Count() != 0
                  select ele;
foreach (var connection in connections)
{
    // look for your flat file connection
    if (connection.Attribute(dts + "ObjectName").Value == "Flat File Connection Manager")
    {
        var connectionDetails = connection.Element(dts + "ObjectData").Element(dts + "ConnectionManager");
        Console.WriteLine("CodePage: " + connectionDetails.Attribute(dts + "CodePage").Value);
        Console.WriteLine("Unicode: " + connectionDetails.Attribute(dts + "Unicode").Value);
        var columnList = connection.Descendants(dts + "FlatFileColumn");
        foreach (var column in columnList)
        {
            Console.WriteLine("Column name: " + column.Attribute(dts + "ObjectName").Value);
            Console.WriteLine("Column type: " + column.Attribute(dts + "DataType").Value);
        }
    }
}
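Folding the same traversal into the GetFields shape the question asks for could look like this sketch (it matches the connection by ObjectName instead of the hard-coded name above, and assumes the Field class from the question plus using System.Collections.Generic, System.Linq and System.Xml.Linq):
public static List<Field> GetFields(string packagePath, string connectionName)
{
    XNamespace dts = "www.microsoft.com/SqlServer/Dts";
    XDocument doc = XDocument.Load(packagePath);
    // find the named connection manager, then project its flat file columns
    return doc.Descendants(dts + "ConnectionManager")
              .Where(c => (string)c.Attribute(dts + "ObjectName") == connectionName)
              .Descendants(dts + "FlatFileColumn")
              .Select(col => new Field
              {
                  Name = (string)col.Attribute(dts + "ObjectName"),
                  Type = (string)col.Attribute(dts + "DataType")
              })
              .ToList();
}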
