This one is a strange one. I am trying to save a polygon from Google Maps into MS SQL via an MVC controller. The problem is that the first time I do it, it works; the second time it gives me the error:
The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 3 ("@2"): The supplied value is not a valid instance of data type geography. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.
I am using EntityFramework 6.1.3, code first. The error appears on the commit line below:
var newPoly = new GenericPolygon()
{
Name = webShape.Name,
PolyShape = shapePolygon,
IsEnabled = true,
IsDeleted = false
};
_unitOfWork.PolygonRepository.Add(newPoly);
_unitOfWork.Commit();
The SQL table structure is the same as the class except that it has an int ID identity column as well, and the name is a varchar(255). The PolyShape column is of type geography.
The shapePolygon variable is defined like this, with the class adding a read-only property called "LongLat", which is used to switch from the Google LatLong to the MS LongLat format:
var shapePolygon = DbGeography.PolygonFromText("POLYGON((" + webShape.LongLat + "))", 4326);
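For reference, the WKT that PolygonFromText expects lists each point as longitude then latitude (the reverse of Google's LatLng order), and the ring must be closed by repeating the first point. A minimal sketch with made-up coordinates:
// Hypothetical coordinates; note longitude first, and the ring closed on its first point.
var wkt = "POLYGON((-0.1278 51.5074, -0.1240 51.5000, -0.1300 51.4990, -0.1278 51.5074))";
var poly = DbGeography.PolygonFromText(wkt, 4326); // SRID 4326 = WGS84, as used by Google Maps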
The commit line itself calls the db context save method (I'm using UoW pattern to cut down on code):
this.context.SaveChanges();
I can't for the life of me figure out why it works once, and then not again, unless I restart my VS (running VS 2013 with IIS Express - SQL 2008 R2 Enterprise on a server).
Any help or pointers would be appreciated :-)
I seem to have narrowed down on the issue, and whilst it is more of a workaround than an answer this may help someone else.
The issue is the version of SQL Server, namely SQL 2008 R2 10.50.4000. I migrated my database to SQL Server 2012 build 11.0.5058, after which the code worked, every time.
Hope this helps someone!
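A plausible reason the migration fixed it: SQL Server 2008 requires a geography instance to fit inside a single hemisphere, and a ring wound in the wrong direction describes the complement of the intended shape, which usually breaks that limit. SQL Server 2012 lifted the hemisphere restriction and added ReorientObject() to repair such rings. A sketch, assuming the Microsoft.SqlServer.Types assembly (version 11 or later) is referenced:
// Detect a ring that covers "more than a hemisphere" (usually an inverted ring) and flip it.
var geo = SqlGeography.Parse("POLYGON((-0.1278 51.5074, -0.1300 51.4990, -0.1240 51.5000, -0.1278 51.5074))");
if (geo.EnvelopeAngle() > 90)
{
geo = geo.ReorientObject();
}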
I just had this and solved it by reversing the order of the points in the polygon. SQL Server's geography type expects the exterior ring to follow the left-hand rule: as you walk along the ring, the interior of the polygon must be on your left, so a ring traced in the opposite direction describes the wrong region.
So instead of appending each point with a string concatenation like strGeog += string.Format("{0} {1}, ", latlong[0], latlong[1]); I prepended it:
foreach (XmlNode xnPoly in xmlPolyList)
{
strGeog = "";
firstlatlong = null;
if (xnPoly["coordinates"] != null)
{
latlongpairs = xnPoly["coordinates"].InnerText.Replace("\n", "").Split(' ');
foreach (string ll in latlongpairs)
{
latlong = ll.Split(',');
if (firstlatlong == null) firstlatlong = latlong;
strGeog = string.Format("{0} {1}, ", latlong[0], latlong[1]) + strGeog;
}
}
if (strGeog.Length > 0)
{
strGeog = strGeog.Substring(0, strGeog.Length - 2); //trim off the last comma and space
strGeog = "POLYGON((" + string.Format("{0} {1} ", firstlatlong[0], firstlatlong[1]) + strGeog + "))"; // conversion from WKT needs it to come back to the first point.
}
i++;
dbPCPoly = new PostCodePolygon();
dbPCPoly.geog = DbGeography.PolygonFromText(strGeog, 4326);
LocDB.PostCodePolygons.Add(dbPCPoly);
LocDB.SaveChanges();
Console.WriteLine(string.Format("Added Polygon {0} for Postcode ({1})", dbPCPoly.PCPolyID, dbPC.PostCodeName));
}
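Distilled down, the fix is simply emitting the coordinate pairs in reverse order and then closing the ring. A compact sketch of the same idea (variable names hypothetical, using System.Linq):
// latLongPairs holds the raw "lng,lat" tokens, e.g. from coordinates.Split(' ')
var points = latLongPairs
.Select(p => p.Split(','))
.Select(ll => ll[0] + " " + ll[1])
.Reverse()
.ToList();
points.Add(points[0]); // WKT requires the ring to end on its starting point
string wkt = "POLYGON((" + string.Join(", ", points) + "))";
var geog = DbGeography.PolygonFromText(wkt, 4326);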
I'm creating an application that loads data from SQL Database once a day and saves it into a text file.
The main table is a "Transactions" table, which holds data about all transactions made on that day. One of the columns represents a middle-man call sign.
My program saves the data in a DataTable first and then with a StringBuilder I give it the proper form and finally save it into a text file with StreamWriter.
My question is: how, or at which stage of the process, can I distinguish one table entry from another? I want to create two files: one with the transactions made by middle-man A and another with those made by middle-man B.
This is my code so far:
// Query for Data
row = new SqlDataAdapter("SELECT [MSISDN], [Amount], [Transaction_ID], POS.[Name], MNO.[Call Sign] FROM "
+ "[Transactions] join [POS] "
+ "on Transactions.POS_ID = POS.idPOS "
+ "join [MNO] on Transactions.MNO_ID = MNO.idMNO "
+ "where [Status] = '1'", con);
row.Fill(Row);
// Save Data in StringBuilder
for (int i = 0; i < Row.Rows.Count; i++)
{
sb.Append(Row.Rows[i].ItemArray[0].ToString()).Append(",");
double amount = Convert.ToDouble(Row.Rows[i].ItemArray[1].ToString());
sb.Append(Math.Round(amount, 2).ToString().Replace(",", ".")).Append(",");
sb.Append(Row.Rows[i].ItemArray[2].ToString()).Append(",");
sb.Append(Row.Rows[i].ItemArray[3].ToString()).Append(",");
sb.Append(Row.Rows[i].ItemArray[4].ToString()).Append(",").Append(Environment.NewLine);
}
// Create a file from StringBuilder
mydocpath = @"C:\Transactions\" + fileDate.ToString(format) + ".txt";
FileStream fsOverwrite = new FileStream(mydocpath, FileMode.Create);
using (StreamWriter outfile = new StreamWriter(fsOverwrite))
{
outfile.Write(sb.ToString()); // synchronous Write: an un-awaited WriteAsync can still be running when the writer is disposed
}
Hope I was clear enough. English isn't my strong suit. Nor, it seems, is coding...
One option: put all your data into a DataSet and then run XSL transformations against ds.GetXml().
Here is kind of an example:
http://granadacoder.wordpress.com/2007/05/15/xml-to-xml-conversion/
But what I would do is eliminate the DataTable altogether and use an IDataReader.
Loop over the data. Do the original query with an ORDER BY on the middle-man identifier, and whenever that identifier changes from one row to the next, close the previous file and start a new one. Something like that; see the sketch after the links below.
You may be able to learn something from this demo:
http://granadacoder.wordpress.com/2009/01/27/bulk-insert-example-using-an-idatareader-to-strong-dataset-to-sql-server-xml/
Here are a couple of IDataReader helpers:
http://kalit-codesnippetsofnettechnology.blogspot.com/2009/05/write-textfile-from-sqldatareader.html
and
How to efficiently write to file from SQL datareader in c#?
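Building on that idea, here is a minimal sketch of the break-on-change pattern against the question's query (connectionString, fileDate, and format are assumed to exist as in the question; the file naming is illustrative):
// Order by the middle-man call sign so each middle-man's rows arrive together,
// then start a new file whenever the call sign changes.
string sql = "SELECT [MSISDN], [Amount], [Transaction_ID], POS.[Name], MNO.[Call Sign] "
+ "FROM [Transactions] "
+ "JOIN [POS] ON Transactions.POS_ID = POS.idPOS "
+ "JOIN [MNO] ON Transactions.MNO_ID = MNO.idMNO "
+ "WHERE [Status] = '1' "
+ "ORDER BY MNO.[Call Sign]";
using (var con = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql, con))
{
con.Open();
using (var dr = cmd.ExecuteReader())
{
string currentSign = null;
StreamWriter outfile = null;
while (dr.Read())
{
string sign = dr["Call Sign"].ToString();
if (sign != currentSign) // the call sign "makes a jump"
{
if (outfile != null) outfile.Dispose();
outfile = new StreamWriter(@"C:\Transactions\" + sign + "_" + fileDate.ToString(format) + ".txt");
currentSign = sign;
}
double amount = Convert.ToDouble(dr["Amount"]);
outfile.WriteLine(dr["MSISDN"] + "," + Math.Round(amount, 2).ToString().Replace(",", ".") + ","
+ dr["Transaction_ID"] + "," + dr["Name"] + "," + sign + ",");
}
if (outfile != null) outfile.Dispose();
}
}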
The SSIS script task given below takes a ticker symbol and a date range and returns a CSV-formatted download that can be used to extract the price history, but it does not work and I have no idea why. Full information about this SSIS approach can be found in the link below.
SSIS / ETL Example – Yahoo Equity & Mutual Fund Price History
You can download the sample SSIS package from the below link.
Sample package on SkyDrive
The following SSIS script task compiles without errors, but it does not download a file.
I have just started to work through this code. As I understand it, the download URL is assembled from the components below, and with the correct values it should work, but I don't understand why it is not retrieving the file.
ichart.finance.yahoo.com/table.csv?s={symbol}&a={startMM}&b={startDD}&c={startYYYY}&d={endMM}&e={endDD}&f={endYYYY}&g={res}&ignore=.csv
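For example, a daily request for MSFT over calendar year 2012 would look like this (note that the script below treats the month parameters as zero-based, so a=0 is January and d=11 is December):
http://ichart.finance.yahoo.com/table.csv?s=MSFT&a=0&b=1&c=2012&d=11&e=31&f=2012&g=d&ignore=.csv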
Script task code that I am using:
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using System.Configuration;
using System.Collections.Generic;
using System.Data.Sql;
using System.Data.SqlClient;
using System.Net;
using System.Collections.Specialized;
using System.Linq;
using Hash = System.Collections.Generic.Dictionary<string, string>;
namespace ST_361aad0e48354b30b8152952caab8b2b.csproj
{
[System.AddIn.AddIn("ScriptMain", Version = "1.0", Publisher = "", Description = "")]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
#region VSTA generated code
enum ScriptResults
{
Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
};
#endregion
static string dir;
static DateTime end;
const string CSV_FORMAT = "Id,Cusip,Date,Open,High,Low,Close,Volume,Adj Close";
public void Main()
{
// end date is today
end = DateTime.Now;
// output directory stored in SSIS variable
// which can be set at runtime
dir = System.IO.Path.Combine(Dts.Variables["OutputCSV"].Value.ToString(), end.ToString("yyyyMMdd"));
if (!System.IO.Directory.Exists(dir))
System.IO.Directory.CreateDirectory(dir);
// connection string to our database
var connectionString = Dts.Variables["ConnectionString"].Value.ToString();
// the sql command to execute
var sql = Dts.Variables["PriceHistorySqlCommand"].Value.ToString();
var list = new List<Hash>();
using (var cnn = new SqlConnection(connectionString))
{
cnn.Open();
using (var cmd = new SqlCommand(sql, cnn))
{
cmd.CommandTimeout = 0;
var dr = cmd.ExecuteReader();
while (dr.Read())
{
// store result in temporary hash
var h = new Hash();
h["cusip"] = dr["cusip"].ToString();
h["symbol"] = dr["symbol"].ToString();
h["product_id"] = dr["product_id"].ToString();
h["last_price_dt_id"] = dr["last_price_dt_id"].ToString();
list.Add(h);
// process batches of 100 at a time
// (This requires System.Threading.dll (CTP of parallel extensions) to be installed in the GAC)
if (list.Count >= 100)
{
System.Threading.Tasks.Parallel.ForEach(list, item =>
{
var dt = item["last_price_dt_id"].TryGetDateFromDateDimensionId(end.AddYears(-100));
DownloadPriceHistory(item["product_id"], item["cusip"], item["symbol"], dt);
});
list.Clear();
}
}
// after the reader is exhausted, process any leftover items (fewer than 100);
// without this final flush, small result sets would never be downloaded
if (list.Count > 0)
{
System.Threading.Tasks.Parallel.ForEach(list, item =>
{
var dt = item["last_price_dt_id"].TryGetDateFromDateDimensionId(end.AddYears(-100));
DownloadPriceHistory(item["product_id"], item["cusip"], item["symbol"], dt);
});
}
}
}
Dts.TaskResult = (int)ScriptResults.Success;
}
static void DownloadPriceHistory(string id, string cusip, string symbol, DateTime begin)
{
// get write path
var path = System.IO.Path.Combine(dir, cusip + ".csv");
var url = String.Format("http://ichart.finance.yahoo.com/table.csv?s={0}&d={1}&e={2}&f={3}&g=d&a={4}&b={5}&c={6}&ignore=.csv",
symbol.ToUpper(),
(end.Month - 1).ToString("00"), end.Day.ToString("00"), end.Year,
(begin.Month - 1).ToString("00"), begin.Day.ToString("00"), begin.Year);
string csv;
using (WebClient web = new WebClient())
{
try
{
var text = web.DownloadString(url);
var lines = text.Split('\n');
System.Text.StringBuilder sb = new System.Text.StringBuilder();
int i = 0;
foreach (var line in lines)
{
// the first line is a header; emit our own header instead
if (i == 0)
sb.AppendLine(CSV_FORMAT);
// ensure line being added is not null
else if (false == String.IsNullOrEmpty(line) && false == String.IsNullOrEmpty(line.Trim()))
sb.AppendLine(id + "," + cusip + "," + line);
i++;
}
// add header and body
csv = sb.ToString();
}
catch (System.Net.WebException)
{
// 404 error
csv = CSV_FORMAT;
}
}
System.IO.File.WriteAllText(path, csv);
}
}
/// <summary>
/// Some simple extension methods.
/// </summary>
public static class ExtensionMethods
{
/// <summary>
/// Gets a datetime object from a dimension id string for example '20090130' would be translated to
/// a proper datetime of '01-30-2009 00:00:00'. If the string is empty then we default to the passed
/// in <paramref name="defaultIfNull"/>.
/// </summary>
/// <param name="str">The string</param>
/// <param name="defaultIfNull">The default value to use when the string is null or empty.</param>
/// <returns>Returns the datetime.</returns>
public static DateTime TryGetDateFromDateDimensionId(this string str, DateTime defaultIfNull)
{
if (String.IsNullOrEmpty(str)) return defaultIfNull;
return DateTime.Parse(str.Substring(4, 2) + "/" + str.Substring(6, 2) + "/" + str.Substring(0, 4));
}
}
}
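To make the helper concrete, a usage sketch (note that DateTime.Parse here is culture-sensitive; parsing with an invariant culture would be more robust):
// "20090130" -> "01" + "/" + "30" + "/" + "2009" -> 2009-01-30 under a US-style culture
DateTime d1 = "20090130".TryGetDateFromDateDimensionId(DateTime.MinValue);
DateTime d2 = "".TryGetDateFromDateDimensionId(new DateTime(1900, 1, 1)); // falls back to the default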
Import ticker price history from Yahoo Finance Chart website using SSIS:
There is another way to import ticker symbol price history from the Yahoo Chart website into a database using SSIS. Here is a sample package written using SSIS 2008 R2, with the database in SQL Server 2008 R2.
Create an SSIS package named (say SO_14797886.dtsx) using Business Intelligence Development Studio (BIDS) and create an OLE DB connection manager/data source that connects to your database. This sample uses the data source OLEDB_Sora.ds that connects to the database Sora on my local machine running the instance KIWI\SQLSERVER2008R2. KIWI is the machine name and SQLSERVER2008R2 is the instance name.
Execute the below given script in the database to create two tables.
Table dbo.TickerSymbols will hold the information about list of ticker symbols and the start and end dates for which you would like to import the price files along with the resolution of the import. Resolution can contain values like d for day; w for weekly; m for monthly; and y for yearly.
Table dbo.TickerPriceHistory will hold the price history information of the symbols downloaded from Yahoo Finance Chart website.
Insert script has added four records for the ticker symbols AAPL (Apple); MSFT (Microsoft); GOOG (Google); and YHOO (Yahoo). Each record is set with different date ranges and resolution.
Script to create tables and insert few ticker symbols data:
CREATE TABLE dbo.TickerSymbols
(
Id int IDENTITY(1,1) NOT NULL
, Symbol varchar(10) NOT NULL
, StartDate datetime NOT NULL
, EndDate datetime NOT NULL
, Resolution char(1) NOT NULL
, CONSTRAINT [PK_TickerSymbols] PRIMARY KEY CLUSTERED ([Id] ASC)
);
GO
CREATE TABLE dbo.TickerPriceHistory
(
Id int IDENTITY(1,1) NOT NULL
, Symbol varchar(10) NOT NULL
, PriceDate datetime NOT NULL
, PriceOpen numeric(18,2) NULL
, PriceHigh numeric(18,2) NULL
, PriceLow numeric(18,2) NULL
, PriceClose numeric(18,2) NULL
, Volume bigint NULL
, AdjustmentClose numeric(18,2) NULL
, CONSTRAINT [PK_TickerPriceHistory] PRIMARY KEY CLUSTERED ([Id] ASC)
);
GO
INSERT INTO dbo.TickerSymbols (Symbol, StartDate, EndDate, Resolution) VALUES
('AAPL', '2012-02-01', '2012-02-04', 'd')
, ('GOOG', '2013-01-01', '2013-01-31', 'w')
, ('MSFT', '2012-09-01', '2012-11-30', 'm')
, ('YHOO', '2012-01-01', '2012-12-31', 'y')
;
GO
On the SSIS package, create the following variables.
EndDate: The package will use this variable of data type DateTime to hold the end date of the symbol being looped through in the record set list.
FileExtension: This variable of data type String will hold the file extension to use for the downloaded files. This is optional.
FileName: This variable of data type String will hold the name of the file for a given symbol. The name is generated based on a timestamp to avoid overwriting previously downloaded files. Click the variable and press F4 to view properties. Change the property EvaluateAsExpression to True. Click the Ellipsis button against Expression to open the Expression Builder. Set the Expression to the following value. This expression evaluates to a value like MSFT_20130210_092519.csv, where MSFT is the symbol, the rest of the information is the package start time in the format yyyyMMdd_hhmmss, and .csv is the file extension.
@[User::Symbol] + "_" + (DT_WSTR, 4) YEAR(@[System::StartTime]) + RIGHT("00" + (DT_WSTR, 2) MONTH(@[System::StartTime]), 2) + RIGHT("00" + (DT_WSTR, 2) DAY(@[System::StartTime]), 2) + "_" + RIGHT("00" + (DT_WSTR, 2) DATEPART("hh", @[System::StartTime]), 2) + RIGHT("00" + (DT_WSTR, 2) DATEPART("mi", @[System::StartTime]), 2) + RIGHT("00" + (DT_WSTR, 2) DATEPART("ss", @[System::StartTime]), 2) + @[User::FileExtension]
FilePath: This variable of data type String will hold the complete path of the downloaded file for a given symbol. Click the variable and press F4 to view properties. Change the property EvaluateAsExpression to True. Click the Ellipsis button against Expression to open the Expression Builder. Set the Expression to the value @[User::RootFolder] + "\\" + @[User::FileName]. We will use this expression later to point the flat file connection manager at each downloaded file at runtime.
Resolution: The package will use this variable of data type String to hold the resolution information of the symbol being looped through in the record set list.
RootFolder: This variable of data type String will hold the root folder where the files should be downloaded to.
SQL_GetSymbols: This variable of data type String will contain the T-SQL query to fetch the ticker symbols information from database. Set the value to SELECT Symbol, StartDate, EndDate, Resolution FROM dbo.TickerSymbols
StartDate: The package will use this variable of data type DateTime to hold the start date of the symbol being looped through in the record set list.
Symbol: The package will use this variable of data type String to hold the ticker symbol as it loops through each record in the record set list.
SymbolsList: The package will use this variable of data type Object to hold the result set of ticker symbols stored in the database.
URLYahooChart: This variable of data type String will hold the URL to Yahoo Finance Chart website with place holders to fill in the appropriate values for query string. Set the value to http://ichart.finance.yahoo.com/table.csv?s={0}&a={1}&b={2}&c={3}&d={4}&e={5}&f={6}&g={7}&ignore=.csv
On the package, right-click on the Connection Managers tab and click Flat File Connection...
On the General page of Flat File Connection Manager Editor, perform the following actions:
Set the Name to FILE_TickerPriceHistory
Set the Description to Read the ticker symbol price history.
If you already have a sample file, point to the file location; SSIS will infer the settings from the data in the file. In this case, I had already downloaded a file by navigating to the URL http://ichart.finance.yahoo.com/table.csv?s=MSFT&a=9&b=1&c=2012&d=11&e=30&f=2012&g=m&ignore=.csv and saved it as C:\Siva\StackOverflow\Files\14797886\Data\MSFT_20130210_092519.csv
Make sure the Format is set to Delimited.
Make sure the Header row delimiter is set to {CR}{LF}
Check the box Column names in the first data row
Click Columns page
On the Columns page of Flat File Connection Manager Editor, make sure that the Row delimiter is set to {LF} and Column delimiter is set to Comma {,}. Click Advanced page.
On the Advanced page of Flat File Connection Manager Editor, the columns will be created based on the file information. Change the values as shown below so that the column names match the names in the database; this makes the column mapping easier. All columns except the last should have the ColumnDelimiter set to Comma {,}. The last column should have the ColumnDelimiter set to {LF}.
Column Data type DataPrecision DataScale
------------------- ------------------------------------ ------------- ---------
PriceDate date [DT_DATE]
PriceOpen numeric [DT_NUMERIC] 18 2
PriceHigh numeric [DT_NUMERIC] 18 2
PriceLow numeric [DT_NUMERIC] 18 2
PriceClose numeric [DT_NUMERIC] 18 2
Volume eight-byte unsigned integer [DT_UI8]
AdjustmentClose numeric [DT_NUMERIC] 18 2
You should now see both the connection managers at the bottom of the package.
Drag and drop an Execute SQL Task on to the Control Flow tab and perform the following actions on the General tab.
Set the Name to Get symbols from database
Set the Description to Fetch the list of symbols and its download settings from database.
Set the ResultSet to Full result set because the query will return a record set.
Set the ConnectionType to OLE DB
Set the Connection to OLEDB_Sora
Set the SQLSourceType to Variable
Set the SourceVariable to User::SQL_GetSymbols
Click Result Set page.
On the Result Set page of Execute SQL Task, click Add and set Result Name to 0 indicating the index of the result set. Select User::SymbolsList from the Variable Name to store the result set into object variable.
Drag and drop a Foreach Loop Container and place it after the Execute SQL Task. Connect the Execute SQL Task's green arrow to the Foreach Loop Container. Double-click the Foreach Loop Container to open the Foreach Loop Editor. On the Collection page, set the enumerator to Foreach ADO Enumerator and the ADO object source variable to User::SymbolsList.
On the Variable Mappings page of the Foreach Loop Editor, map index 0 to User::Symbol, index 1 to User::StartDate, index 2 to User::EndDate, and index 3 to User::Resolution, matching the column order of the query in SQL_GetSymbols.
Drag and drop a Script Task inside the ForEach Loop Container. Double-click the Script Task to open the Script Task Editor. On the Script page of the Script Task editor, click the Ellipsis button against the ReadOnlyVariables and select the below listed variables. We need to use these inside the Script Task code.
User::EndDate
User::FileExtension
User::FileName
User::FilePath
User::Resolution
User::RootFolder
User::StartDate
User::Symbol
User::URLYahooChart
Click the Edit Script... button on the Script Task Editor and type the below code. After typing the code, close the Script Task Editor.
Script Task code in C#:
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using System.Net;
namespace ST_5fa66fe26d20480e8e3258a8fbd16683.csproj
{
[System.AddIn.AddIn("ScriptMain", Version = "1.0", Publisher = "", Description = "")]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
#region VSTA generated code
enum ScriptResults
{
Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
};
#endregion
public void Main()
{
try
{
string symbol = Dts.Variables["User::Symbol"].Value.ToString();
DateTime startDate = Convert.ToDateTime(Dts.Variables["User::StartDate"].Value);
DateTime endDate = Convert.ToDateTime(Dts.Variables["User::EndDate"].Value);
string resolution = Dts.Variables["User::Resolution"].Value.ToString();
string urlYahooChart = Dts.Variables["User::URLYahooChart"].Value.ToString();
string rootFolder = Dts.Variables["User::RootFolder"].Value.ToString();
string fileExtension = Dts.Variables["User::FileExtension"].Value.ToString();
string fileName = Dts.Variables["User::FileName"].Value.ToString();
string downloadPath = Dts.Variables["User::FilePath"].Value.ToString();
if (!System.IO.Directory.Exists(rootFolder))
System.IO.Directory.CreateDirectory(rootFolder);
urlYahooChart = string.Format(urlYahooChart
, symbol
, startDate.Month
, startDate.Day
, startDate.Year
, endDate.Month
, endDate.Day
, endDate.Year
, resolution);
bool refire = false;
Dts.Events.FireInformation(0, string.Format("Download URL of {0}", symbol), urlYahooChart, string.Empty, 0, ref refire);
using (WebClient webClient = new WebClient())
{
webClient.DownloadFile(urlYahooChart, downloadPath);
}
Dts.TaskResult = (int)ScriptResults.Success;
}
catch (Exception ex)
{
Dts.Events.FireError(0, "Download error", ex.ToString(), string.Empty, 0);
Dts.TaskResult = (int)ScriptResults.Failure;
}
}
}
}
Drag and drop a Data Flow Task inside the Foreach Loop Container after the Script Task, and connect the green arrow from the Script Task to the Data Flow Task.
On the Data Flow Task, drag and drop a Flat File Source and point it at the FILE_TickerPriceHistory connection manager to read the price history CSV files.
Drag and drop a Derived Column Transformation and create a new column named Symbol with the expression (DT_STR,10,1252)@[User::Symbol] to add the symbol to the data pipeline.
Drag and drop an OLE DB Destination, point it at the OLEDB_Sora connection manager and the table dbo.TickerPriceHistory, and map the columns by name to insert the data into the database.
The data flow now runs Flat File Source, then Derived Column, then OLE DB Destination.
Before running the package, we need to make a couple of changes to prevent warnings or errors in the design-time view due to the absence of the files in the folder.
Click the flat file connection manager FILE_TickerPriceHistory and press F4 to view the properties. Change the property DelayValidation to True. This will make sure that the validation of file existence will happen during runtime. Click the Ellipsis button against the Expression and set the ConnectionString property to the value @[User::FilePath]. This will change the file path as each file is being downloaded from the website.
Click the Data Flow Task and press F4 to view the properties. Change the property DelayValidation to True. This will make sure that the validation of file existence will happen during runtime.
Navigate to the Data Flow tab and click the Flat File Source and press F4 to view the properties. Change the property ValidateExternalMetadata to False. This will make sure that the validation of flat file existence will happen during runtime.
Navigate to the folder C:\Siva\StackOverflow\Files\14797886, where the downloaded files will be saved; it is currently empty. The folder does not have to be empty; this is just to verify the execution.
Run the following SQL statements against the database to verify the data in the tables. The second table should be empty.
SELECT * FROM dbo.TickerSymbols;
SELECT * FROM dbo.TickerPriceHistory;
Execute the package. If everything is set up correctly, the package should run successfully and download the files for each symbol listed in table dbo.TickerSymbols
The files should be successfully saved to the folder C:\Siva\StackOverflow\Files\14797886. Notice that each file is named appropriately based on the expressions provided in the package.
Run the following SQL statement against the database to verify the data in the table. The table dbo.TickerPriceHistory should now have the data from the price files downloaded from website.
SELECT * FROM dbo.TickerPriceHistory;
The above sample package illustrated how to download price files from Yahoo Finance Chart website for a given list of ticker symbols and load them into the database.
Using C# OpenXML, I am attempting to open an Excel file, bind to its connections.xml stream, and update the embedded SQL query. I am able to successfully replace individual character sequences within the connection/command node, but attempting to explicitly set the command attribute (i.e. csNode.Attributes["command"].Value = "select * from ....") results in a corrupted workbook.
xmlDoc.Load(wkb.WorkbookPart.ConnectionsPart.GetStream());
csNode = xmlDoc.SelectSingleNode("*/*/*[@connection]");
csNode.Attributes["command"].Value = Regex.Replace(csNode.Attributes["command"].Value, @"\(\[\w*\].\[\w*\].\[\w*\].\[\w*\].*\)", "(" + subQry + ")", RegexOptions.Multiline);
xmlDoc.Save(wkb.WorkbookPart.ConnectionsPart.GetStream());
wkb.Close();
Not sure if this is the only way to solve this issue, but I was able to correct it by deleting the original connections.xml stream and creating/attaching a new one with the correct value to the workbook.
//select connections node from loaded xml Excel
csNode = xmlDoc.SelectSingleNode("*/*/*[@connection]");
//store original node values
oldConnValue = csNode.Attributes["connection"].Value;
oldCommValue = csNode.Attributes["command"].Value;
//delete existing ConnectionsPart - to ensure that bleed-over data is not present
wkb.WorkbookPart.DeletePart(wkb.WorkbookPart.ConnectionsPart);
//create a replacement ConnectionsPart
wkb.WorkbookPart.AddNewPart<ConnectionsPart>();
csNode.Attributes["connection"].Value = oldConnValue; //reassign existing connection value
csNode.Attributes["command"].Value = baseQry; //assign new query
//save changes to stream
xmlDoc.Save(wkb.WorkbookPart.ConnectionsPart.GetStream());
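A likely root cause of the original corruption: GetStream() opens the existing part stream without truncating it, so saving a shorter XML document leaves stale bytes from the old content at the end of the stream. If that is what happened, opening the stream with FileMode.Create (which truncates) should avoid having to delete and recreate the part. A minimal sketch:
// Truncate the part stream before saving so a shorter document
// does not leave trailing bytes from the previous content behind.
using (var stream = wkb.WorkbookPart.ConnectionsPart.GetStream(System.IO.FileMode.Create))
{
xmlDoc.Save(stream);
}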
I'm building a system that reads 5 CSV files each month. These files are supposed to follow a certain format and ordering. I have one master table and 5 temporary tables. Each CSV file is read first and then bulk inserted into its corresponding temporary table. After bulk inserting the 5 csv files into their respective temporary tables I once again insert all the records from the temporary table to the master table. This makes sure that all files are uploaded first before inserting the data to the master table.
I built this system using ASP.net and during debugging and testing everything went fine. The problem occurs whenever I deploy the application to a production server. After I deployed the application I used the same csv files I uploaded during development and testing and the system shows a data conversion error from string to date time format.
I tried many things to fix this but it seems the problem still persist. I tried changing the collation of the production database to the same one I used during development. I also tried changing some regional settings in the production server but it still doesn't work.
I thought maybe I can handle this programmatically and instead of bulk inserting from the temporary tables to the master table I would write some kind of a for loop that would insert each record manually to the master table, but then I suppose it would create a performance issue since I'll be inserting around 100,000 records each time.
I wonder if anyone has faced a similar issue during deployment. It still seems weird to me that the behaviour of the application changed after deployment.
Following is a portion of the code where the inventory.csv file is uploaded to the server, bulk inserted into a temporary table TB_TEMP_INVENTORY, and the records are then inserted from the temp table into the master table TB_CATTLE. The same is done for the 4 other files with almost identical code.
OleDbConnection conn = new OleDbConnection(ConfigurationManager.AppSettings["LivestockConnectionString"]);
OleDbCommand comm;
OleDbDataAdapter adapter;
DataTable table = new DataTable();
string file = string.Empty;
string content = string.Empty;
StreamReader reader;
StreamWriter writer;
string month = monthDropDownList.SelectedValue;
string year = yearDropDownList.SelectedItem.Text;
// upload inventory file
file = System.IO.Path.GetFileName(inventoryFileUpload.PostedFile.FileName);
inventoryFileUpload.PostedFile.SaveAs("C://LivestockCSV//" + file);
// clean inventory file
file = "C://LivestockCSV//" + file;
reader = new StreamReader(file);
content = reader.ReadToEnd();
reader.Close();
// chain all replacements on the in-memory string, then write the file once
content = content.Replace("\"", ""); // remove quotation marks
content = content.Replace(",NULL,", ",,"); // remove NULL
content = content.Replace(",0,", ",,"); // remove 0 dates
content = content.Replace(",0", ","); // remove 0 dates at end of line
writer = new StreamWriter(file);
writer.Write(content);
writer.Close();
try
{
conn.Open();
comm = new OleDbCommand("TRUNCATE TABLE TB_TEMP_INVENTORY", conn); // clear temp table
comm.ExecuteNonQuery();
// bulk insert from csv to temp table
comm = new OleDbCommand(@"SET DATEFORMAT DMY;
BULK INSERT TB_TEMP_INVENTORY
FROM '" + file + "'" +
#" WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)", conn);
comm.ExecuteNonQuery();
// check if data for same month exists in cattle table
comm = new OleDbCommand(@"SELECT *
FROM TB_CATTLE
WHERE Report='Inventory' AND Month=" + month + " AND Year=" + year, conn);
if (comm.ExecuteScalar() != null)
{
comm = new OleDbCommand(@"DELETE
FROM TB_CATTLE
WHERE Report='Inventory' AND Month=" + month + " AND Year=" + year, conn);
comm.ExecuteNonQuery();
}
// insert into master cattle table
comm = new OleDbCommand(@"SET DATEFORMAT MDY;
INSERT INTO TB_CATTLE(ID, Sex, BirthDate, FirstCalveDate, CurrentUnit, OriginalCost, AccumulatedDepreciation, WrittenDownValue, NetRealizableValue, CapitalGainLoss, Month, Year, Report, Locked, UploadedBy, UploadedAt)
SELECT DISTINCT ID, Sex, BirthDate, FirstCalveDate, CurrentUnit, 0, 0, 0, 0, 0, " + month + ", " + year + @", 'Inventory', 0, 'Admin', '" + DateTime.Now + @"'
FROM TB_TEMP_INVENTORY", conn);
comm.ExecuteNonQuery();
conn.Close();
}
catch (Exception ex)
{
ClientScript.RegisterStartupScript(typeof(string), "key", "<script>alert('" + ex.Message + "');</script>");
return;
}
You don't specify how you are doing the insert, but a reasonable option here would be something like SqlBulkCopy, which can take either a DataTable or an IDataReader as input; this would give you ample opportunity to massage the data - either in-memory (DataTable), or via the streaming API (IDataReader), while still using an efficient import. CsvReader is a good option for loading the CSV.
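A minimal sketch of that approach, assuming a SQL Server connection string (SqlBulkCopy does not work over an OleDbConnection) and the LumenWorks CsvReader, which implements IDataReader; the file path and header flag are illustrative:
// Stream the cleaned CSV straight into the staging table.
using (var csv = new CsvReader(new StreamReader(@"C:\LivestockCSV\inventory.csv"), true)) // true = file has a header row
using (var bulk = new SqlBulkCopy(connectionString)) // connectionString assumed defined
{
bulk.DestinationTableName = "TB_TEMP_INVENTORY";
bulk.WriteToServer(csv);
}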
The other option is to use a very basic insert into the staging table, and massage the data via TSQL code.
Re why it has changed between dev and production, the most likely answers are:
the data you used in dev was not representative
there is an environmental/configuration difference between the two
1) Check the SQL Server LANGUAGE and DATEFORMAT settings in the dev/testing and production environments:
DBCC USEROPTIONS
2) What date format is used in the CSV files (source)?
3) What data type is used for the date/time field (destination)?
DECLARE @v VARCHAR(10) = '2010-08-23';
SET DATEFORMAT mdy;
SELECT CAST(@v AS DATETIME) -- succeeds: under mdy the literal is read as yyyy-mm-dd
,CAST(@v AS DATE) -- DATE always treats 'yyyy-mm-dd' as ISO, regardless of DATEFORMAT
,YEAR(CAST(@v AS DATETIME))
,MONTH(CAST(@v AS DATETIME))
,DAY(CAST(@v AS DATETIME));
SET DATEFORMAT dmy;
SELECT CAST(@v AS DATETIME) -- fails: under dmy the literal is read as yyyy-dd-mm, and 23 is not a valid month
,CAST(@v AS DATE); -- still succeeds, for the same reason as above