I am writing the structure below into a file using BinaryWriter.
File structure:
Num of Employees {employee0 num & name, employee1 num & name, employee2 num & name...}.
I will receive a command with an employee num; I need to delete that particular employee's details and update the Num of Employees field accordingly.
What is the best way to do this?
Regards
Raju
The question is a little unclear, but when I'm working with a file I always try to deserialize structured data like that into classes.
In general, you'd read a list of Employee objects from the file, remove the item you want, and then write the list back out to the file (or DB, or any other destination). Quick example (I haven't tried this, so it might not compile as-is):
// EmpList is a List<Employee>
public void DeleteEmployee(int employeeNumber)
{
    // Assumes you will never have multiple entries with the same EE#, or that
    // if you do, you want to delete all matching records.
    for (int i = EmpList.Count - 1; i >= 0; i--)
    {
        if (EmpList[i].EmployeeNumber == employeeNumber)
            EmpList.RemoveAt(i);
    }
    // Call code to reserialize (to file, DB, whatever)
}
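Since the question mentions BinaryWriter, here is a minimal sketch of the load/rewrite cycle, assuming each record is just an int employee number followed by a name (adjust to your actual layout; the Load/Save names are mine, not from the question):

```csharp
using System.Collections.Generic;
using System.IO;

public class Employee
{
    public int EmployeeNumber;
    public string Name;
}

public static class EmployeeFile
{
    // Reads the layout: [count][num][name][num][name]...
    public static List<Employee> Load(string path)
    {
        var list = new List<Employee>();
        using (var reader = new BinaryReader(File.OpenRead(path)))
        {
            int count = reader.ReadInt32();
            for (int i = 0; i < count; i++)
            {
                list.Add(new Employee
                {
                    EmployeeNumber = reader.ReadInt32(),
                    Name = reader.ReadString()
                });
            }
        }
        return list;
    }

    // Rewrites the whole file, so "Num of Employees" always stays consistent.
    public static void Save(string path, List<Employee> list)
    {
        using (var writer = new BinaryWriter(File.Create(path)))
        {
            writer.Write(list.Count);
            foreach (var e in list)
            {
                writer.Write(e.EmployeeNumber);
                writer.Write(e.Name);
            }
        }
    }
}
```

A delete then becomes: Load, RemoveAll on the list, Save. Rewriting the whole file is much simpler than patching it in place, and for small files the cost is negligible.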
I have some rows from a table in a temp DB that I want to copy to another (non-temp!) DB, into a table with the same name that has the same number, type, and order of columns.
So my idea was to create a function to avoid repeating myself:
public int copyDB(string tableName, int[] id)
{
    for (int i = 0; i < id.Length; i++)
    {
        // pseudocode: the table is chosen by name
        var temp = from t in dbtemp.<tableName> where t.student_id == id[i] select t;
    }
    (...)
}
Is it possible to write a function like this? I would just call it wherever I need it with the correct arguments.
I'm also still looking into how to copy the data; I've found some interesting links like this one, and I still need to try them.
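As one way to sketch this without dynamic LINQ, plain ADO.NET with SqlBulkCopy can copy rows between two databases when the schemas match. Everything here is illustrative, not from the original post: the CopyRows name, the connection strings, and the student_id column.

```csharp
using System.Data;
using System.Data.SqlClient;

public static class TableCopier
{
    // Sketch: copies the rows with the given ids from source DB to target DB.
    // tableName cannot be passed as a SQL parameter, so it must come from
    // trusted code (never from user input) to avoid SQL injection.
    public static void CopyRows(string sourceConn, string targetConn,
                                string tableName, int[] ids)
    {
        var rows = new DataTable();
        using (var src = new SqlConnection(sourceConn))
        using (var cmd = src.CreateCommand())
        {
            // ids are ints, so joining them into the IN clause is safe
            cmd.CommandText = "SELECT * FROM " + tableName +
                              " WHERE student_id IN (" + string.Join(",", ids) + ")";
            src.Open();
            new SqlDataAdapter(cmd).Fill(rows);
        }

        using (var bulk = new SqlBulkCopy(targetConn))
        {
            // works because both tables have identical column order and types
            bulk.DestinationTableName = tableName;
            bulk.WriteToServer(rows);
        }
    }
}
```

Because the columns match in number, type, and order, SqlBulkCopy needs no column mappings here.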
Thanks for your help
I have some problems with collection lists. I use the list to save some customer data which I collect from XML and text files.
// First I create an instance of the list
List<Customer> cusList = new List<Customer>();
// Get the files
String[] somefiles = Directory.GetFiles("FromPath");
// Then I loop through the files and collect data for the list
for (int i = 0; i < somefiles.Length; i++)
{
    if (some statements match)
    {
        // call a method and save the file data to cusList
        cusList = callmethode(somefiles[i]);
    }
    else
    {
        System.Console.WriteLine("Do nothing");
    }
}
I want the list to grow across all files, but at the moment, after the loop, I only have the data from the last file.
How can I make it keep the data from all the files?
Kind regards
When you write cusList = callmethode(file); you re-assign the list at every iteration. What you need instead is something like this:
cusList.AddRange(callmethode(file));
This will just add the elements returned from the method to your list instead of replacing the entire list.
If the method just returns one single element use cuslist.Add instead.
HimBromBeere explained the issue and provided the correct answer; just as a side note, you could use LINQ to simplify this task:
List<Customer> cusList = somefiles
.Where(f => some statements match)
.SelectMany(f => callmethode(f))
.ToList();
I am designing a program that takes a file as input; in my case, a census file where each record has 4 fields: age, gender, marital status, and district.
My questions are:
How exactly could I take the input and sort it into arrays, both integer (age and district) and string (marital status and gender) data types?
How do I use them to count how many of each there are?
Any suggestions will help! I know how to read in the file and separate the fields using input.Split(',') to split on each comma; however, I am having trouble structuring the loop so it doesn't repeat unnecessarily.
You could start with something like this; this code uses LINQ.
var records = File.ReadAllLines(filepath)   // read all lines
    .Select(line => line.Split(','))        // split each line on commas
    .Select(s => new                        // convert to an (anonymous) object with properties
    {
        Age = int.Parse(s[0]),
        Gender = s[1],
        MaritalStatus = s[2],
        District = int.Parse(s[3]),
    }).ToList();
Now you can access each record using
foreach(var record in records)
{
// logic
Console.WriteLine(record);
}
and Count using
int count = records.Count();
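To answer the "count how many of each there are" part, LINQ's GroupBy works directly on the parsed records. A minimal sketch, assuming the records list from above (the "Married" value is just an example, use whatever your file actually contains):

```csharp
// Count records per gender; the same pattern works for District, Age, etc.
var byGender = records
    .GroupBy(r => r.Gender)
    .Select(g => new { Gender = g.Key, Count = g.Count() });

foreach (var g in byGender)
    Console.WriteLine(g.Gender + ": " + g.Count);

// Or count one specific group directly:
int married = records.Count(r => r.MaritalStatus == "Married");
```

This avoids the repetitive looping you mention: one pass builds the records, and each count is a single expression over them.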
I have two DataGridViews in the main form: the first displays data from SAP, and the other displays data from a Vertica DB. The FM I'm using is RFC_READ_TABLE, but there's an exception when calling this FM: if there are too many columns in the target table, the SAP connector returns a DATA_BUFFER_EXCEED exception. Is there any other FM, or another way, to retrieve data from SAP without this exception?
I have figured out one workaround: split the fields into several arrays, store each part's data in a DataTable, and then merge the DataTables. But I'm afraid it will cost a lot of time if the row count is large.
screenshot of the program
Here is my code:
RfcDestination destination = RfcDestinationManager.GetDestination(cmbAsset.Text);
readTable = destination.Repository.CreateFunction("RFC_READ_TABLE");
/*
 * RFC_READ_TABLE will only extract data up to 512 chars per row.
 * If you load more data, you will get a DATA_BUFFER_EXCEEDED exception.
 */
readTable.SetValue("query_table", table);
readTable.SetValue("delimiter", "~"); // field delimiter in the returned rows
if (tbRowCount.Text.Trim() != string.Empty)
    readTable.SetValue("rowcount", tbRowCount.Text);

t = readTable.GetTable("DATA");
t.Clear(); // removes all rows from this table

t = readTable.GetTable("FIELDS");
t.Clear();
if (selectedCols.Trim() != "")
{
    string[] field_names = selectedCols.Split(",".ToCharArray());
    if (field_names.Length > 0)
    {
        t.Append(field_names.Length); // adds the given number of rows
        int i = 0;
        foreach (string n in field_names)
        {
            t.CurrentIndex = i++;
            t.SetValue(0, n);
        }
    }
}

t = readTable.GetTable("OPTIONS");
t.Clear();
t.Append(1);
t.CurrentIndex = 0;
t.SetValue(0, filter); // the WHERE-clause text
try
{
    readTable.Invoke(destination);
}
catch (Exception e)
{
    // TODO: at least log the exception instead of silently swallowing it
}
First of all, you should use BBP_READ_TABLE if it is available in your system; it is better for many reasons. But that is not the point of your question. RFC_READ_TABLE has two imports, ROWCOUNT and ROWSKIPS, and you have to use them.
I would recommend a ROWCOUNT between 30,000 and 60,000. You then execute the RFC several times, incrementing ROWSKIPS each time: first loop ROWCOUNT=30000 and ROWSKIPS=0, second loop ROWCOUNT=30000 and ROWSKIPS=30000, and so on...
Also be careful with float fields when using the old RFC_READ_TABLE; there is one in table LIPS, and this RFC has problems with them.
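The ROWCOUNT/ROWSKIPS paging described above can be sketched with the NCo objects from the question; this is an untested outline, assuming readTable and destination are set up as shown there:

```csharp
// Page through RFC_READ_TABLE in chunks of 30,000 rows.
const int pageSize = 30000;
int skip = 0;
while (true)
{
    readTable.SetValue("rowcount", pageSize.ToString());
    readTable.SetValue("rowskips", skip.ToString());
    readTable.Invoke(destination);

    IRfcTable data = readTable.GetTable("DATA");
    if (data.RowCount == 0)
        break; // no more rows

    // ... append this page's rows to your DataTable here ...

    skip += pageSize;
}
```

Each iteration fetches one page, so no single call exceeds the buffer, and the loop stops as soon as a call returns an empty DATA table.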
Use transaction BAPI, press Filter, and set it to All.
Under Logistics Execution you will find deliveries.
The detail screen shows the function name.
Test them directly to find one that suits, then call that function instead of RFC_READ_TABLE.
example:
BAPI_LIKP_GET_LIST_MSG
Another possibility is to have a custom ABAP RFC function developed to get your data (with the advantage that you can get a structured, multi-table response in one call, and the disadvantage that it is not a standard function/BAPI).
I would like to store values in a text file as comma- or tab-separated values (it doesn't matter which).
I am looking for a reusable library that can manipulate the data in this text file as if it were a SQL table.
I need select * from... and delete from where ID = ... (ID will be the first column in the text file).
Is there some CodePlex project that does this kind of thing?
I do not need complex functionality like joins or relationships. I will just have one text file, which will become one database table.
SQLite :)
Use LINQ to CSV.
http://www.codeproject.com/KB/linq/LINQtoCSV.aspx
http://www.thinqlinq.com/Post.aspx/Title/LINQ-to-CSV-using-DynamicObject.aspx
If it's not CSV:
Let your file hold one record per line. Each record at runtime should be read into a collection of type Record (assuming Record is a custom class representing an individual record). You can run LINQ operations on the collection and write it back to the file.
Use ODBC. There is a Microsoft Text Driver for CSV files, so I think this would be possible. I haven't tested whether you can manipulate the data via ODBC, but you can test it easily.
For querying you can also use LINQ.
Have you looked at the FileHelpers library? It has the capability of reading and parsing a text file into CLR objects. Combining that with the power of something like LINQ to Objects, you have the functionality you need.
public class Item
{
    public int ID { get; set; }
    public string Type { get; set; }
    public string Instance { get; set; }
}

class Program
{
    static void Main(string[] args)
    {
        string[] lines = File.ReadAllLines("database.txt");
        var list = lines
            .Select(l =>
            {
                var split = l.Split(',');
                return new Item
                {
                    ID = int.Parse(split[0]),
                    Type = split[1],
                    Instance = split[2]
                };
            });

        Item secondItem = list.Where(item => item.ID == 2).Single();

        List<Item> newList = list.ToList<Item>();
        newList.RemoveAll(item => item.ID == 2);
        // overwrite database.txt with the new data from newList
    }
}
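The commented-out last step, writing the filtered list back out, can be a straightforward rewrite of the file; a minimal sketch assuming the same three-column comma format:

```csharp
// Rewrite database.txt from the filtered list (this is the "delete").
var outLines = newList.Select(item =>
    item.ID + "," + item.Type + "," + item.Instance);
File.WriteAllLines("database.txt", outLines);
```

Rewriting the whole file keeps the code trivial; for a single small table that is usually fine.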
What about deleting data? LINQ is for querying, not for manipulation.
However, List<T> provides a predicate-based RemoveAll that does what you want:
newList.RemoveAll(item => item.ID == 2);
You can also look at the more advanced solution "LINQ to Text or CSV Files".
I would also suggest using ODBC. This should not complicate deployment; the whole configuration can be set in the connection string, so you do not need a DSN.
Together with a schema.ini file you can even set column names and data types; check this KB article from MS.
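A minimal sketch of that ODBC route, using the Microsoft Text Driver; the folder path, file name, columns, and schema.ini contents are illustrative, and note the text driver supports SELECT and INSERT but not DELETE or UPDATE, so deleting a row still means rewriting the file:

```csharp
using System;
using System.Data.Odbc;

// schema.ini, placed next to database.txt in C:\data:
//   [database.txt]
//   Format=CSVDelimited
//   ColNameHeader=False
//   Col1=ID Integer
//   Col2=Type Text
//   Col3=Instance Text

class TextDriverDemo
{
    static void Main()
    {
        string connStr =
            @"Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=C:\data;";
        using (var conn = new OdbcConnection(connStr))
        using (var cmd = new OdbcCommand(
            "SELECT * FROM database.txt WHERE ID = 2", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader["Type"] + " " + reader["Instance"]);
            }
        }
    }
}
```

The Dbq attribute points at the folder, and each text file in it is queried as if it were a table.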
SQLite, or LINQ to Text and CSV :)