I can't find any documentation that outlines the correct way to use OneDrive to store and keep app files synchronised across devices in C#.
I have read the documentation at the OneDrive Dev Center, but I don't understand the HTTP code (I'm self-taught, C# only).
I roughly understand that I use the delta method to get changed files from OneDrive and then save them locally, but I can't figure out exactly how, so I've worked around it by comparing local against OneDrive manually using the GetAsync<> methods.
My current implementation (below for reference) seems rather clunky compared to what is probably handled better in the API.
In addition, it doesn't appear that there is a reverse 'delta' function, i.e. where I write a file to the app locally and then tell OneDrive to sync the change. Is that because I need to actually upload it using the PutAsync<> method? (That's what I'm currently doing.)
public async Task<T> ReadFromXML<T>(string gamename, string filename)
{
string filepath = _appFolder + @"\" + gamename + @"\" + filename + ".xml";
T objectFromXML = default(T);
var srializer = new XmlSerializer(typeof(T));
Item oneDItem = null;
int casenum = 0;
//_userDrive is the IOneDriveClient
if (_userDrive != null && _userDrive.IsAuthenticated)
{
try
{
oneDItem = await _userDrive.Drive.Special.AppRoot.ItemWithPath(filepath).Request().GetAsync();
if (oneDItem != null) casenum += 1;
}
catch (OneDriveException)
{ }
}
StorageFile localfile = null;
try
{
localfile = await ApplicationData.Current.LocalFolder.GetFileAsync(filepath);
if (localfile != null) casenum += 2;
}
catch (FileNotFoundException)
{ }
switch (casenum)
{
case 0:
//neither exists. Throws an exception to be caught by the calling method, which should then instantiate a new object of type <T>
throw new FileNotFoundException();
case 1:
//OneDrive only - should copy the stream to a new local file then return the object
StorageFile writefile = await ApplicationData.Current.LocalFolder.CreateFileAsync(filepath, CreationCollisionOption.ReplaceExisting);
using (var newlocalstream = await writefile.OpenStreamForWriteAsync())
{
using (var oneDStream = await _userDrive.Drive.Special.AppRoot.ItemWithPath(filepath).Content.Request().GetAsync())
{
oneDStream.CopyTo(newlocalstream);
}
}
using (var newreadstream = await writefile.OpenStreamForReadAsync())
{ objectFromXML = (T)srializer.Deserialize(newreadstream); }
break;
case 2:
//Local only - returns the object
using (var existinglocalstream = await localfile.OpenStreamForReadAsync())
{ objectFromXML = (T)srializer.Deserialize(existinglocalstream); }
break;
case 3:
//Both - compares last modified. If OneDrive, replaces local data then returns the object
var localinfo = await localfile.GetBasicPropertiesAsync();
var localtime = localinfo.DateModified;
var oneDtime = (DateTimeOffset)oneDItem.FileSystemInfo.LastModifiedDateTime;
switch (oneDtime > localtime)
{
case true:
using (var newlocalstream = await localfile.OpenStreamForWriteAsync())
{
using (var oneDStream = await _userDrive.Drive.Special.AppRoot.ItemWithPath(filepath).Content.Request().GetAsync())
{ oneDStream.CopyTo(newlocalstream); }
}
using (var newreadstream = await localfile.OpenStreamForReadAsync())
{ objectFromXML = (T)srializer.Deserialize(newreadstream); }
break;
case false:
using (var existinglocalstream = await localfile.OpenStreamForReadAsync())
{ objectFromXML = (T)srializer.Deserialize(existinglocalstream); }
break;
}
break;
}
return objectFromXML;
}
Synchronization requires a few different steps, some of which the OneDrive API will help you with, some of which you'll have to do yourself.
Change Detection
The first stage is obviously to detect whether anything has changed. The OneDrive API provides two mechanisms to detect changes in the service:
Changes for individual files can be detected using a standard request with an If-None-Match:
await this.userDrive.Drive.Special.AppRoot.ItemWithPath(remotePath).Content.Request(new Option[] { new HeaderOption("If-None-Match", "etag") }).GetAsync();
If the file doesn't exist at all, you'll get back a 404 Not Found.
If the file hasn't changed, you'll get back a 304 Not Modified.
Otherwise you'll get the current state of the file.
Changes for a hierarchy can be detected using the delta API:
await this.userDrive.Drive.Special.AppRoot.Delta(previousDeltaToken).Request().GetAsync();
This will return the current state for all items that changed since the last invocation of delta. If this is the first invocation, previousDeltaToken will be null and the API will return the current state for ALL items within the AppRoot. For each file in the response you'll need to make another round-trip to the service to get the content.
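A rough sketch of consuming the delta response (the paging/token property names below are from memory, so verify them against the SDK version you're using):
// previousDeltaToken is whatever you stored after the last sync; null on the first run.
var deltaPage = await this.userDrive.Drive.Special.AppRoot
    .Delta(previousDeltaToken)
    .Request()
    .GetAsync();

var changedItems = new List<Item>();
string newDeltaToken = previousDeltaToken;

while (deltaPage != null)
{
    changedItems.AddRange(deltaPage.CurrentPage);
    newDeltaToken = deltaPage.Token;   // token property name may differ by SDK version
    deltaPage = deltaPage.NextPageRequest == null
        ? null
        : await deltaPage.NextPageRequest.GetAsync();
}

// Persist newDeltaToken for the next sync. For each file in changedItems you still
// need a separate Content request to download its bytes.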
On the local side you'll need to enumerate all of the files of interest and compare the timestamps to determine if something has changed.
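A minimal sketch of that local pass, assuming the synced files live under the app's LocalFolder and you keep the previously seen timestamps in a dictionary keyed by path:
// Recursively collect files and flag the ones whose DateModified is newer
// than what was recorded after the last sync.
async Task FindLocallyChangedFiles(StorageFolder folder,
    IDictionary<string, DateTimeOffset> lastSeen, IList<StorageFile> changed)
{
    foreach (var file in await folder.GetFilesAsync())
    {
        var props = await file.GetBasicPropertiesAsync();
        DateTimeOffset previous;
        if (!lastSeen.TryGetValue(file.Path, out previous) || props.DateModified > previous)
        {
            changed.Add(file);
        }
    }
    foreach (var sub in await folder.GetFoldersAsync())
    {
        await FindLocallyChangedFiles(sub, lastSeen, changed);
    }
}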
Obviously the previous steps require knowledge of the "last seen" state, and so your application will need to keep track of this in some form of database/data structure. I'd suggest tracking the following:
+------------------+---------------------------------------------------------------------------+
| Property | Why? |
+------------------+---------------------------------------------------------------------------+
| Local Path | You'll need this so that you can map a local file to its service identity |
| Remote Path | You'll need this if you plan to address the remote file by path |
| Remote Id | You'll need this if you plan to address the remote file by unique id |
| Hash | The hash representing the current state of the file |
| Local Timestamp | Needed to detect local changes |
| Remote Timestamp | Needed for conflict resolution |
| Remote ETag | Needed to detect remote changes |
+------------------+---------------------------------------------------------------------------+
Additionally, if using the delta approach you'll need to store the token value from the delta response. This is item-independent, so it would need to be stored in some global field.
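In code, that tracking structure could be as simple as something along these lines (a sketch - persist it however suits your app):
// One entry per synced file, persisted between runs (e.g. serialized to a local file).
public class SyncStateEntry
{
    public string LocalPath { get; set; }                 // maps a local file to its service identity
    public string RemotePath { get; set; }                // if you address the remote file by path
    public string RemoteId { get; set; }                  // if you address the remote file by unique id
    public string Hash { get; set; }                      // content state as last seen
    public DateTimeOffset LocalTimestamp { get; set; }    // detect local changes
    public DateTimeOffset RemoteTimestamp { get; set; }   // conflict resolution
    public string RemoteETag { get; set; }                // detect remote changes
}

public class SyncState
{
    // Item-independent: the token returned by the last delta call.
    public string LastDeltaToken { get; set; }

    public List<SyncStateEntry> Entries { get; set; } = new List<SyncStateEntry>();
}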
Conflict Resolution
If changes were detected on both sides your app will need to go through a conflict resolution process. An app that lacks an understanding of the files being synced would need to either prompt the user for manual conflict resolution, or do something like fork the file so there are now two copies. However, apps that are dealing with custom file formats should have enough knowledge to effectively merge the files without any form of user interaction. What that entails is obviously completely dependent on the file being synced.
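Using the tracking entry sketched above, the skeleton of that decision could look like the following; the actual merge is entirely format-specific:
enum SyncAction { None, PushLocalToRemote, PullRemoteToLocal, Merge }

// Decide what to do for one tracked file, given the freshly observed
// local timestamp and remote eTag.
static SyncAction Resolve(SyncStateEntry entry,
    DateTimeOffset currentLocalTimestamp, string currentRemoteETag)
{
    bool localChanged = currentLocalTimestamp > entry.LocalTimestamp;
    bool remoteChanged = currentRemoteETag != entry.RemoteETag;

    if (localChanged && remoteChanged) return SyncAction.Merge;   // conflict: merge, fork, or ask the user
    if (remoteChanged) return SyncAction.PullRemoteToLocal;
    if (localChanged) return SyncAction.PushLocalToRemote;
    return SyncAction.None;
}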
Apply Changes
The final step is to push the merged state to wherever is required (e.g. if the change was local then push remote, if the change was remote then push local, otherwise if the change was in both places push both places). It's important to make sure this step occurs in such a way as to avoid replacing content that was written after the "Change Detection" step has taken place. Locally you'd probably accomplish this by locking the file during the process, however you cannot do that with the remote file. Instead you'll want to use the etag value to make sure the service only accepts the request if the state is still what you expect:
await this.userDrive.Drive.Special.AppRoot.ItemWithPath(remotePath).Content.Request(new Option[] { new HeaderOption("If-Match", "etag") }).PutAsync(newContentStream);
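If someone else changed the file in the meantime, the service rejects that request with 412 Precondition Failed, which the SDK surfaces as a OneDriveException; a rough sketch of handling it (the exact error code to test for is worth verifying):
try
{
    await this.userDrive.Drive.Special.AppRoot
        .ItemWithPath(remotePath)
        .Content
        .Request(new Option[] { new HeaderOption("If-Match", knownEtag) })
        .PutAsync(newContentStream);

    // Success: update the stored eTag/timestamps for this file.
}
catch (OneDriveException)
{
    // Precondition failed: the remote file changed since we last saw it.
    // Re-run change detection and conflict resolution, then retry.
}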
Related
I have a FileSystemWatcher for a local directory and it's working fine. I want to implement the same for FTP. Is there any way I can achieve this? I have checked many solutions, but none are clear.
Logic: I want to get files from FTP that are newer than some timestamp.
Problem faced: getting all files from FTP and then filtering the result hurts performance (I used FtpWebRequest).
Is there a right way to do this? (WinSCP is on hold; I can't use it now.)
FileSystemWatcher oFsWatcher = new FileSystemWatcher();
OFSWatchers.Add(oFsWatcher);
oFsWatcher.Path = sFilePath;
oFsWatcher.Filter = string.IsNullOrWhiteSpace(sFileFilter) ? "*.*" : sFileFilter;
oFsWatcher.NotifyFilter = NotifyFilters.FileName;
oFsWatcher.EnableRaisingEvents = true;
oFsWatcher.IncludeSubdirectories = bIncludeSubdirectories;
oFsWatcher.Created += new FileSystemEventHandler(OFsWatcher_Created);
You cannot use FileSystemWatcher or anything like it, because the FTP protocol does not have any API to notify a client about changes in a remote directory.
All you can do is periodically iterate the remote tree and find changes.
It's actually rather easy to implement if you use an FTP client library that supports recursive listing of a remote tree. Unfortunately, the built-in .NET FTP client, FtpWebRequest, does not. But for example with the WinSCP .NET assembly, you can use the Session.EnumerateRemoteFiles method.
See the article Watching for changes in SFTP/FTP server:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "example.com",
UserName = "user",
Password = "password",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
List<string> prevFiles = null;
while (true)
{
// Collect file list
List<string> files =
session.EnumerateRemoteFiles(
"/remote/path", "*.*", EnumerationOptions.AllDirectories)
.Select(fileInfo => fileInfo.FullName)
.ToList();
if (prevFiles == null)
{
// In the first round, just print number of files found
Console.WriteLine("Found {0} files", files.Count);
}
else
{
// Then look for differences against the previous list
IEnumerable<string> added = files.Except(prevFiles);
if (added.Any())
{
Console.WriteLine("Added files:");
foreach (string path in added)
{
Console.WriteLine(path);
}
}
IEnumerable<string> removed = prevFiles.Except(files);
if (removed.Any())
{
Console.WriteLine("Removed files:");
foreach (string path in removed)
{
Console.WriteLine(path);
}
}
}
prevFiles = files;
Console.WriteLine("Sleeping 10s...");
Thread.Sleep(10000);
}
}
(I'm the author of WinSCP)
Though, if you actually want to just download the changes, it's way easier. Just use Session.SynchronizeDirectories in a loop.
while (true)
{
SynchronizationResult result =
session.SynchronizeDirectories(
SynchronizationMode.Local, "/remote/path", @"C:\local\path", true);
result.Check();
// You can inspect result.Downloads for a list for updated files
Console.WriteLine("Sleeping 10s...");
Thread.Sleep(10000);
}
This will update even modified files, not only new files.
Though using the WinSCP .NET assembly from a web application might be problematic. If you do not want to use a 3rd-party library, you have to live with the limitations of the FtpWebRequest. For an example of how to recursively list a remote directory tree with the FtpWebRequest, see my answer to List names of files in FTP directory and its subdirectories.
You have edited your question to say that you have performance problems with the solutions I've suggested. Though you have already asked a new question that covers this:
Get FTP file details based on datetime in C#
Unless you have access to the OS which hosts the service, it will be a bit harder.
FileSystemWatcher places a hook on the filesystem, which will notify your application as soon as something happened.
The FTP command specification does not have such a hook. Besides that, the protocol is always initiated by the client.
Therefore, to implement such logic you should periodically perform an NLST to list the FTP directory contents and track the changes (or hashes, or perhaps modification times via MDTM) yourself.
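For illustration, a crude version of that polling loop with the built-in FtpWebRequest could look like this (single directory, no recursion; host, path and credentials are placeholders):
// Poll one FTP directory with NLST and report names added/removed since the last pass.
List<string> previous = null;
while (true)
{
    var request = (FtpWebRequest)WebRequest.Create("ftp://example.com/remote/path/");
    request.Method = WebRequestMethods.Ftp.ListDirectory;          // NLST
    request.Credentials = new NetworkCredential("user", "password");

    var current = new List<string>();
    using (var response = (FtpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        string name;
        while ((name = reader.ReadLine()) != null)
        {
            current.Add(name);
        }
    }

    if (previous != null)
    {
        foreach (string added in current.Except(previous))
            Console.WriteLine("Added: " + added);
        foreach (string removed in previous.Except(current))
            Console.WriteLine("Removed: " + removed);
    }

    // To detect modified files as well, issue a WebRequestMethods.Ftp.GetDateTimestamp
    // (MDTM) request per file and compare the returned LastModified values.
    previous = current;
    Thread.Sleep(10000);   // poll every 10 seconds
}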
More info:
FTP return codes
FTP
I have found an alternative solution for my requirement.
Explanation:
I download the files from FTP (read permission required) keeping the same folder structure.
So every time the job/service runs, I check whether the same file (full path) already exists in the local physical path. If it doesn't exist, it can be considered a new file, and I can take some action for it and download it as well.
It's just an alternative solution.
Code Changes:
private static void GetFiles()
{
using (FtpClient conn = new FtpClient())
{
string ftpPath = "ftp://myftp/";
string downloadFileName = @"C:\temp\FTPTest\";
downloadFileName += "\\";
conn.Host = ftpPath;
//conn.Credentials = new NetworkCredential("ftptest", "ftptest");
conn.Connect();
//Get all directories
foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
FtpListOption.Modify | FtpListOption.Recursive))
{
// if this is a file
if (item.Type == FtpFileSystemObjectType.File)
{
string localFilePath = downloadFileName + item.FullName;
//Only newly created files will be downloaded.
if (!File.Exists(localFilePath))
{
conn.DownloadFile(localFilePath, item.FullName);
//Do any action here.
Console.WriteLine(item.FullName);
}
}
}
}
}
So I'm having a problem with automating my code to check-in files to TFS, and it's been driving me up the wall! Here is my code:
string location = AppDomain.CurrentDomain.BaseDirectory;
TfsTeamProjectCollection baseUserTpcConnection = new TfsTeamProjectCollection(uriToTeamProjectCollection);
IIdentityManagementService ims = baseUserTpcConnection.GetService<IIdentityManagementService>();
TeamFoundationIdentity identity = ims.ReadIdentity(IdentitySearchFactor.AccountName, @"PROD1\JR", MembershipQuery.None, ReadIdentityOptions.None);
TfsTeamProjectCollection impersonatedTpcConnection = new TfsTeamProjectCollection(uriToTeamProjectCollection, identity.Descriptor);
VersionControlServer sourceControl = impersonatedTpcConnection.GetService<VersionControlServer>();
Workspace workspace = sourceControl.CreateWorkspace("MyTempWorkspace", sourceControl.AuthorizedUser);
String topDir = null;
try
{
Directory.CreateDirectory(location + "TFS");
String localDir = location + "TFS";
workspace.Map("$/Automation/", localDir);
workspace.Get();
destinationFile = Path.Combine(localDir, Name + ".xml");
string SeconddestinationFile = Path.Combine(localDir, Name + ".ial");
bool check = sourceControl.ServerItemExists(destinationFile, ItemType.Any);
PendingChange[] pendingChanges;
File.Move(sourceFile, destinationFile);
File.Copy(destinationFile, sourceFile, true);
File.Move(SecondsourceFile, SeconddestinationFile);
File.Copy(SeconddestinationFile, SecondsourceFile, true);
if (check == false)
{
workspace.PendAdd(localDir,true);
pendingChanges = workspace.GetPendingChanges();
workspace.CheckIn(pendingChanges, Comments);
}
else
{
workspace.PendEdit(destinationFile);
pendingChanges = workspace.GetPendingChanges();
workspace.CheckIn(pendingChanges, Comments);
}
The problem occurs whenever my code is attempting to check in NEW files (PendEdit works correctly when the files already exist in TFS) and runs through this code:
if (check == false)
{
workspace.PendAdd(localDir,true);
pendingChanges = workspace.GetPendingChanges();
workspace.CheckIn(pendingChanges, Comments);
}
The files, instead of being in the included changes in pending changes, end up in the excluded changes, like so:
When the line that actually does the check-in runs, I get a "The array must contain at least one element" error, and the only way to fix it is to manually add those detected changes and promote them to included changes. I simply can't for the life of me figure out how to do that programmatically through C#. If anyone has any guidance on what direction I should take for this, I would really appreciate it! Thank you!
Edit: I've also discovered another way to solve this by reconciling the folder, which also promotes the detected changes, but again the problem is I can't seem to figure out how to make that happen automatically.
I know that opening the Visual Studio developer command prompt, changing to the folder that this mapping is in, and running "tf reconcile /promote" is one way, but I can only automate that as far as the /promote part, because that brings up a dialog box that a user would have to provide input to, which defeats the purpose of the automation. I'm at a loss.
Next Edit in response to TToni:
I'm not entirely sure if I did the CreateWorkspaceParameters correctly (see picture 1), but this time it gave the same error, and the files were not even in the excluded section; they just didn't show up anywhere in the pending changes (see picture 2).
Check this blog:
The workspace has a method GetPendingChangesWithCandidates, which actually gets all the “Excluded” changes. Code snippet is as below:
private void PendChangesAndCheckIn(string pathToWorkspace)
{
//Get Version Control Server object
VersionControlServer vs = collection.GetService(typeof
(VersionControlServer)) as VersionControlServer;
Workspace ws = vs.TryGetWorkspace(pathToWorkspace);
//Do Delete and Copy Actions to local path
//Create a item spec from the server Path
PendingChange[] candidateChanges = null;
string serverPath = ws.GetServerItemForLocalItem(pathToWorkspace);
List<ItemSpec> its = new List<ItemSpec>();
its.Add(new ItemSpec(serverPath, RecursionType.Full));
//get all candidate changes and promote them to included changes
ws.GetPendingChangesWithCandidates(its.ToArray(), true,
out candidateChanges);
foreach (var change in candidateChanges)
{
if (change.IsAdd)
{
ws.PendAdd(change.LocalItem);
}
else if (change.IsDelete)
{
ws.PendDelete(change.LocalItem);
}
}
//Check In all pending changes
ws.CheckIn(ws.GetPendingChanges(), "This is a comment");
}
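Applied to the code in the question, that means after the PendAdd you would pull in the candidate changes for the mapped folder rather than relying on GetPendingChanges alone - roughly (a sketch using the names from the question):
// Promote the "detected" (excluded) adds under the mapped folder, then check in.
string serverPath = workspace.GetServerItemForLocalItem(localDir);
ItemSpec[] specs = { new ItemSpec(serverPath, RecursionType.Full) };

PendingChange[] candidates;
workspace.GetPendingChangesWithCandidates(specs, true, out candidates);

foreach (PendingChange change in candidates)
{
    if (change.IsAdd)
    {
        workspace.PendAdd(change.LocalItem);
    }
}

workspace.CheckIn(workspace.GetPendingChanges(), Comments);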
I'm trying to find out if a user added new music to the Music folder on the phone since app was last used.
I try to do this by checking the DateModified of the Music folder (which updates correctly on the computer when adding new music to the phone):
async void GetModifiedDate ()
{
BasicProperties props = await KnownFolders.MusicLibrary.GetBasicPropertiesAsync();
Debug.WriteLine("DATEMODIFIED: " + props.DateModified.ToString());
}
Unfortunately this returns:
DATEMODIFIED: 1/1/1601 1:00:00 AM +01:00
Am I doing something wrong or is there another quick way to check if user added new music?
KnownFolders.MusicLibrary is a virtual location, so I think there may be a problem with getting its properties.
The other problem is that DateModified may be a bad idea anyway, as it may stay the same when the user adds a file. You cannot rely on it (some information). You can check it yourself - when I tried moving files between folders, their DateModified didn't change.
So in this case, I'm afraid you will have to list the files in MusicLibrary and then decide what to save for future comparison. The sum of the file sizes can be a good choice, since there is little chance that two different sets of music files would add up to exactly the same size. It also depends on whether you want to be notified when the user has moved a file from one folder to another (the total size won't change). If you want to be more certain, you can remember the whole list of Tuple<file.FolderRelativeId, fileSize> (for example).
As FileQueries are not yet available for Windows Phone, you will have to retrieve the files recursively. The simple code can look like this:
// first - a method to retrieve files from folder recursively
private async Task RetrieveFilesInFolder(List<StorageFile> list, StorageFolder parent)
{
foreach (var item in await parent.GetFilesAsync()) list.Add(item);
foreach (var item in await parent.GetFoldersAsync()) await RetrieveFilesInFolder(list, item);
}
private async Task<List<StorageFile>> GetFilesInMusic()
{
StorageFolder folder = KnownFolders.MusicLibrary;
List<StorageFile> listOfFiles = new List<StorageFile>();
await RetrieveFilesInFolder(listOfFiles, folder);
return listOfFiles;
}
Once you have a list of your files, you can decide what to remember for further comparison upon next app launch.
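For example, you could keep a map of FolderRelativeId to file size and compare it against the one saved on the previous launch (a sketch; how you persist the dictionary - local file, settings, etc. - is up to you):
// Build a snapshot of the library: FolderRelativeId -> size in bytes.
private async Task<Dictionary<string, ulong>> BuildSnapshot(List<StorageFile> files)
{
    var snapshot = new Dictionary<string, ulong>();
    foreach (var file in files)
    {
        var props = await file.GetBasicPropertiesAsync();
        snapshot[file.FolderRelativeId] = props.Size;
    }
    return snapshot;
}

// On the next launch: compare against the snapshot saved last time.
private static bool LibraryChanged(Dictionary<string, ulong> previous,
                                   Dictionary<string, ulong> current)
{
    if (previous == null || previous.Count != current.Count) return true;
    foreach (var pair in current)
    {
        ulong oldSize;
        if (!previous.TryGetValue(pair.Key, out oldSize) || oldSize != pair.Value) return true;
    }
    return false;
}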
You can check the folder size instead:
ulong musicFolderSize = (ulong)ApplicationData.Current.LocalSettings.Values["musicFolderSize"];
BasicProperties props = await KnownFolders.MusicLibrary.GetBasicPropertiesAsync();
if (props.Size != musicFolderSize)
{
// ...
ApplicationData.Current.LocalSettings.Values["musicFolderSize"] = props.Size;
}
Adding onto Romansz's answer:
The only way to guarantee that you know of a file change would be to track the files on the device and then compare if there is a change between launches.
A lazier way to get all of the files would be to use KnownFolders.MusicLibrary.GetFilesAsync(CommonFileQuery.OrderByName), which is a deep query by default. It will grab all of the files in one query, but it can be really slow on massive directories. You then have to page through all of the files as shown below:
//cache the virtual storage folder for the loop
StorageFolder musicLibrary = KnownFolders.MusicLibrary;
uint stepSize= 50;
uint startIndex = 0;
while (true)
{
IReadOnlyList<StorageFile> files = await musicLibrary.GetFilesAsync(CommonFileQuery.OrderByName,startIndex,stepSize);
foreach (var file in files)
{
//compare to see if file is in your list
}
if (files.Count < stepSize) break;
startIndex += stepSize;
}
I'm programming an app that interacts with Dropbox using the DropNet API. I want to check whether a folder exists on Dropbox so that, if it doesn't, I can create it and then upload a file to it. Everything seems fine, but if the folder already exists it throws an exception, like this:
if (isAccessToken)
{
byte[] bytes = File.ReadAllBytes(fileName);
try
{
string dropboxFolder = "/Public/DropboxManagement/Logs" + folder;
// I want to check if the dropboxFolder is exist here
_client.CreateFolder(dropboxFolder);
var upload = _client.UploadFile(dropboxFolder, fileName, bytes);
}
catch (DropNet.Exceptions.DropboxException ex) {
MessageBox.Show(ex.Response.Content);
}
}
I'm not familiar with DropNet, but looking at the source code, it appears you should be able to do this by using the GetMetaData() method of your _client object. This method returns a MetaData object.
Example:
//gets contents at requested path
var metaData = _client.GetMetaData("/Public/DropboxManagement/Logs");
//without knowing how this API works, Path may be a full path and therefore need to check for "/Public/DropboxManagement/Logs" + folder
if (metaData.Contents.Any(c => c.Is_Dir && c.Path == folder))
{
//folder exists
}
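Alternatively - assuming the exception you hit is just Dropbox reporting that the folder already exists - you could skip the existence check entirely, attempt the CreateFolder, and ignore that particular failure before uploading. A rough sketch:
try
{
    _client.CreateFolder(dropboxFolder);
}
catch (DropNet.Exceptions.DropboxException)
{
    // Most likely "folder already exists"; inspect the exception's Response
    // if you want to rethrow anything unexpected.
}
var upload = _client.UploadFile(dropboxFolder, fileName, bytes);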
I can't seem to find a solution to this issue. I'm trying to get my Compact Framework application on Windows Mobile 6 to have the ability to move a file on its local filesystem to another system.
Here are the solutions I'm aware of:
FTP - The problem with that is most of the APIs are way too expensive to use.
HTTP PUT - As far as I have been able to find, I can't use anonymous PUT with IIS7, and that's the web server the system is running. (An extreme workaround for this would be to use a different web server to PUT the file, and have that other system transfer it to the IIS system).
Windows share - I would need authentication on the shares, and I haven't seen a way to pass this authentication through Windows Mobile.
The last resort would be to require that the devices be cradled to transfer these files, but I'd really like to be able to have these files be transferred wirelessly.
FTP: define "too expensive". Do you mean performance or byte overhead or dollar cost? Here's a free one with source.
HTTP: IIS7 certainly supports hosting web services or custom IHttpHandlers. You could use either for a data upload pretty easily.
A Windows Share simply requires that you to P/Invoke the WNet APIs to map the share, but it's not terribly complex.
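To expand on the HTTP/IHttpHandler option above: a minimal upload handler on the IIS7 side might look something like the sketch below (the save folder is a placeholder, the handler still needs to be registered in web.config or exposed as an .ashx, and you'd add whatever authentication you need):
using System.IO;
using System.Web;

// Minimal upload endpoint: the device POSTs/PUTs the file body and names it
// via the query string, e.g. http://server/Upload.ashx?name=data.bin
public class UploadHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string name = Path.GetFileName(context.Request.QueryString["name"] ?? "upload.bin");
        string target = Path.Combine(@"C:\uploads", name);   // placeholder folder

        using (Stream input = context.Request.InputStream)
        using (FileStream output = File.Create(target))
        {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
            }
        }

        context.Response.StatusCode = 201; // Created
    }
}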
I ended up just passing information to a web server via a PHP script.
The options provided above just didn't work out for my situation.
Here's the gist of it. I've got some code in there with progress bars and various checks and handlers unrelated to simply sending a file, but I'm sure you can pick through it. I've removed my authentication code from both the C# and the PHP, but it shouldn't be too hard to roll your own, if necessary.
in C#:
/*
* Here's the short+sweet about how I'm doing this
* 1) Copy the file from mobile device to web server by querying the PHP script with parameters for each line
* 2) PHP script checks 1) If we got the whole data file 2) If this is a duplicate data file
* 3) If it is a duplicate, or we didn't get the whole thing, it goes away. The mobile
* device will hang on to its data file in the first case (if it's a duplicate it deletes it)
* to be tried again later
* 4) The server will then process the data files using a scheduled task/cron job at an appropriate time
*/
private void process_attempts()
{
Uri CheckUrl = new Uri("http://path/to/php/script?action=check");
WebRequest checkReq = WebRequest.Create(CheckUrl);
try
{
WebResponse CheckResp = checkReq.GetResponse();
CheckResp.Close();
}
catch
{
MessageBox.Show("Error! Connection not available. Please make sure you are online.");
this.Invoke(new Close(closeme));
}
StreamReader dataReader = File.OpenText(datafile);
String line = null;
line = dataReader.ReadLine();
while (line != null)
{
Uri Url = new Uri("http://path/to/php/script?action=process&line=" + line);
WebRequest WebReq = WebRequest.Create(Url);
try
{
WebResponse Resp = WebReq.GetResponse();
Resp.Close();
}
catch
{
MessageBox.Show("Error! Connection not available. Please make sure you are online.");
this.Invoke(new Close(closeme));
return;
}
try
{
process_bar.Invoke(new SetInt(SetBarValue), new object[] { processed });
}
catch { }
process_num.Invoke(new SetString(SetNumValue), new object[] { processed + "/" + attempts });
processed++;
line = dataReader.ReadLine();
}
dataReader.Close();
Uri Url2 = new Uri("http://path/to/php/script?action=finalize&lines=" + attempts);
Boolean finalized = false;
WebRequest WebReq2 = WebRequest.Create(Url2);
try
{
WebResponse Resp = WebReq2.GetResponse();
Resp.Close();
finalized = true;
}
catch
{
MessageBox.Show("Error! Connection not available. Please make sure you are online.");
this.Invoke(new Close(closeme));
finalized = false;
}
MessageBox.Show("Done!");
this.Invoke(new Close(closeme));
}
In PHP (thoroughly commented for your benefit!):
<?php
//Get the GET'd values from the C#
//The current line being processed
$line = $_GET['line'];
//Which action we are doing
$action = $_GET['action'];
//# of lines in the source file
$totalLines = $_GET['lines'];
//If we are processing the line, open the data file, and append this new line and a newline.
if($action == "process"){
$dataFile = "tempdata/SOME_KIND_OF_UNIQUE_FILENAME.dat";
//open the file
$fh = fopen($dataFile, 'a');
//Write the line, and a newline to the file
fwrite($fh, $line."\r\n");
//Close the file
fclose($fh);
//Exit the script
exit();
}
//If we are done processing the original file from the C# application, make sure the number of lines in the new file matches that in the
//file we are transferring. An expansion of this could be to compare some kind of hash function value of both files...
if($action == "finalize"){
$dataFile = "tempdata/SOME_KIND_OF_UNIQUE_FILENAME.dat";
//Count the number of lines in the new file
$lines = count(file($dataFile));
//If the new file and the old file have the same number of lines...
if($lines == $totalLines){
//File has the matching number of lines, good enough for me over TCP.
//We should move or rename this file.
}else{
//File does NOT have the same number of lines as the source file.
}
exit();
}
if($action == "check"){
//If a file with this unique file name already exists, delete it.
$dataFile = "tempdata/SOME_KIND_OF_UNIQUE_FILENAME.dat";
if(file_exists($dataFile)){
unlink($dataFile);
}
}
?>