This question already has answers here:
Temporary table record limit in Sql server
(6 answers)
Closed 8 years ago.
I intend to load data from several temp tables (t_source) located in different databases
and servers into one temp table (t_main) using C# code and SQL Server. After that, I want
to write the data from t_main into a text file.
My question is: can this cause SQL Server to run out of memory (because it will be storing t_main)? What is the maximum amount of data that I can store in a temp table in a SQL Server 2005 or 2008 database?
Table size (permanent or temporary) is limited only by available storage space. Temp tables live on disk in tempdb; they are not stored in memory.
In SQL Server, a temp table has the same storage limits as an ordinary table.
http://msdn.microsoft.com/en-us/library/ms190768.aspx
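For the second half of the workflow (writing t_main to a text file), here is a rough sketch that streams rows through a SqlDataReader instead of pulling the whole table into client memory. The #t_main name, the open connection, and the tab-separated output format are assumptions, not anything from the question.

```csharp
using System.Data.SqlClient; // Microsoft.Data.SqlClient on newer .NET
using System.IO;

class TempTableExport
{
    static void ExportToFile(SqlConnection conn, string path)
    {
        // A local temp table (#t_main) is only visible to the session that created it,
        // so this must run on that same open connection (or use a global ##t_main).
        using (var cmd = new SqlCommand("SELECT * FROM #t_main;", conn))
        using (var reader = cmd.ExecuteReader())
        using (var writer = new StreamWriter(path))
        {
            while (reader.Read())
            {
                var fields = new string[reader.FieldCount];
                for (int i = 0; i < reader.FieldCount; i++)
                    fields[i] = reader.IsDBNull(i) ? "" : reader.GetValue(i).ToString();

                // One line per row, so the whole table never sits in client memory.
                writer.WriteLine(string.Join("\t", fields));
            }
        }
    }
}
```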
This question already has answers here:
Identity increment is jumping in SQL Server database
(6 answers)
Closed 6 years ago.
I have a table in Entity Framework with a field named ID;
this field is the primary key and an identity column.
When I add records to this table, the value increases by one per record, but after adding several records the value suddenly jumps.
For example, it jumped from 90 to 1010,
even though no transaction failed.
What is the problem?
If you are using Azure SQL this can just happen; we had it happen a few times. It is simply the nature of how Azure SQL works.
See this question: Windows Azure SQL Database - Identity Auto increment column skips values. It goes into a detailed explanation of a case very similar to yours.
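As a practical note, application code should treat identity values as opaque rather than assuming consecutive numbers. A minimal sketch (the dbo.Contacts table, its Name column, and an int ID column are hypothetical) of reading back whatever value the server actually assigned:

```csharp
using System.Data.SqlClient; // Microsoft.Data.SqlClient on newer .NET

class IdentityReadBack
{
    static int InsertContact(SqlConnection conn, string name)
    {
        // OUTPUT INSERTED.ID returns the identity value the server assigned,
        // so gaps caused by identity caching are harmless to the application.
        const string sql =
            "INSERT INTO dbo.Contacts (Name) OUTPUT INSERTED.ID VALUES (@name);";

        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@name", name);
            return (int)cmd.ExecuteScalar();
        }
    }
}
```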
This question already has answers here:
How to best handle the storage of historical data?
(2 answers)
Closed 8 years ago.
What is the best practice for keeping past data in the database? For example, say we have transaction tables like AttendanceTrans and SalaryTrans in a payroll solution. Every month we have to insert hundreds or thousands of new records into these tables, so all the past and current data sits in the same table.
Another approach would be to keep AttendanceHistory and SalaryHistory tables, so that at the end of every period (month) we empty the Trans tables after copying the data to the respective History tables.
Considering factors like performance, ease of report generation, ease of coding and maintenance, what would be the optimum solution?
Note : RDBMS is SQL Server 2008 and programming environment is .NET (C#)
In general you should keep all the data in the same table. SQL Server is great at handling large volumes of data and it will make your life a lot easier (reporting, querying, coding, maintenance) if it's all in one place. If your data is indexed appropriately then you'll be just fine with thousands of new records per month.
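To illustrate, a sketch of a monthly report query against the single SalaryTrans table; the table name, the PeriodDate and Amount columns, and the open connection are assumptions. An index on PeriodDate is what keeps this fast as history accumulates:

```csharp
using System;
using System.Data.SqlClient; // Microsoft.Data.SqlClient on newer .NET

class MonthlyReport
{
    static SqlDataReader ReadMonth(SqlConnection conn, DateTime monthStart)
    {
        // With an index on PeriodDate, this range seek touches only the requested month,
        // no matter how many years of history the table holds.
        const string sql = @"
            SELECT EmployeeId, PeriodDate, Amount
              FROM dbo.SalaryTrans
             WHERE PeriodDate >= @from AND PeriodDate < @to;";

        var cmd = new SqlCommand(sql, conn);
        cmd.Parameters.AddWithValue("@from", monthStart);
        cmd.Parameters.AddWithValue("@to", monthStart.AddMonths(1));
        return cmd.ExecuteReader();
    }
}
```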
In my opinion, the best solution in SQL Server is CDC (Change Data Capture). It is very simple to use, and you can control how much historical data is kept by adjusting the retention and schedule of the cleanup job.
I think this is the best option for performance because CDC reads changes from the transaction log (it does not put triggers on the table); just keep in mind that log records are held until the capture job has processed them, so watch log growth.
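For reference, a rough sketch of what enabling CDC looks like when issued from C# (requires sysadmin; the dbo.SalaryTrans table and the 30-day retention are illustrative assumptions):

```csharp
using System.Data.SqlClient; // Microsoft.Data.SqlClient on newer .NET

class CdcSetup
{
    static void EnableCdc(SqlConnection conn)
    {
        // Enable CDC for the database, then for the dbo.SalaryTrans table.
        // @role_name = NULL means no gating role is required to read the change data.
        const string sql = @"
            EXEC sys.sp_cdc_enable_db;
            EXEC sys.sp_cdc_enable_table
                 @source_schema = N'dbo',
                 @source_name   = N'SalaryTrans',
                 @role_name     = NULL;
            -- Keep 30 days (43200 minutes) of change data before the cleanup job purges it.
            EXEC sys.sp_cdc_change_job @job_type = N'cleanup', @retention = 43200;";

        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}
```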
This question already has answers here:
Determine SQL Server Database Size
(8 answers)
Closed 9 years ago.
How can one check the SQL Server database and table size (in MB) using C#?
Have you looked at the sp_spaceused stored procedure?
Execute it with no arguments for the overall database size, or pass a table name (e.g. EXEC sp_spaceused 'dbo.TableName') for a single table.
Here is the MSDN Link
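A minimal sketch of calling it from C# for one table (the table name and the open connection are placeholders):

```csharp
using System;
using System.Data;
using System.Data.SqlClient; // Microsoft.Data.SqlClient on newer .NET

class TableSize
{
    static void PrintTableSize(SqlConnection conn, string tableName)
    {
        using (var cmd = new SqlCommand("sp_spaceused", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@objname", tableName);

            using (var reader = cmd.ExecuteReader())
            {
                if (reader.Read())
                {
                    // Result columns: name, rows, reserved, data, index_size, unused.
                    Console.WriteLine("{0}: rows={1}, reserved={2}, data={3}",
                        reader["name"], reader["rows"], reader["reserved"], reader["data"]);
                }
            }
        }
    }
}
```

Calling sp_spaceused with no @objname parameter returns the database totals (database_size and unallocated space) instead of a single table's figures.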
This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Speed up update of 185k rows in SQL Server 2008?
I have more than 700,000 contact emails and I want to insert them into SQL Server using C# in a few minutes, not hours, ideally in a single database call.
I know that may not be entirely possible, but I know there's a way to do it.
How can I achieve this?
Have a look at using the SqlBulkCopy class:
Lets you efficiently bulk load a SQL Server table with data from
another source.
Microsoft SQL Server includes a popular command-prompt utility named
bcp for moving data from one table to another, whether on a single
server or between servers. The SqlBulkCopy class lets you write
managed code solutions that provide similar functionality. There are
other ways to load data into a SQL Server table (INSERT statements,
for example), but SqlBulkCopy offers a significant performance
advantage over them.
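A minimal sketch under the following assumptions: the target is a hypothetical dbo.Contacts table with a single Email column, and the 700k addresses are already available as strings:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient; // Microsoft.Data.SqlClient on newer .NET

class ContactLoader
{
    static void BulkInsertEmails(string connectionString, IEnumerable<string> emails)
    {
        // Stage the rows in a DataTable whose schema matches the target table.
        var table = new DataTable();
        table.Columns.Add("Email", typeof(string));
        foreach (var email in emails)
            table.Rows.Add(email);

        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.Contacts";
            bulk.BatchSize = 10000;                  // send rows in 10k batches
            bulk.ColumnMappings.Add("Email", "Email");
            bulk.WriteToServer(table);               // one bulk load, not 700k INSERTs
        }
    }
}
```

For lower memory use, WriteToServer also accepts an IDataReader, so the rows can be streamed from a file rather than staged in a DataTable first.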
This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Solutions for INSERT OR UPDATE on SQL Server
Check if a row exists, otherwise insert
I'm using SQL Server 2008 R2, C# .NET 4.0 (and LINQ).
I want to get or create a row in a table and then append a string to one of its columns.
I'm wondering if there's a best practice for this kind of idiom, or if SQL offers this out of the box?
Somehow transaction start -> select -> insert or update -> transaction end doesn't seem elegant enough...
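For what it's worth, a sketch of the very pattern described above collapsed into one batch: try the UPDATE first and INSERT only if nothing matched. The dbo.Notes table, its Key and Text columns, and the locking hints are illustrative assumptions, not anything prescribed by the question:

```csharp
using System.Data.SqlClient; // Microsoft.Data.SqlClient on newer .NET

class AppendOrCreate
{
    static void AppendText(SqlConnection conn, int key, string fragment)
    {
        // "select -> insert or update" done server-side in a single transaction;
        // UPDLOCK/HOLDLOCK keep a concurrent caller from inserting the same key
        // between the UPDATE and the INSERT.
        const string sql = @"
            BEGIN TRANSACTION;
            UPDATE dbo.Notes WITH (UPDLOCK, HOLDLOCK)
               SET [Text] = [Text] + @fragment
             WHERE [Key] = @key;
            IF @@ROWCOUNT = 0
                INSERT INTO dbo.Notes ([Key], [Text]) VALUES (@key, @fragment);
            COMMIT TRANSACTION;";

        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@key", key);
            cmd.Parameters.AddWithValue("@fragment", fragment);
            cmd.ExecuteNonQuery();
        }
    }
}
```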