I'm uploading a document to SharePoint, but I would like to give it a custom name rather than have it inherit the name of the file I'm uploading.
My code is based on this solution: http://www.codeproject.com/Articles/103503/How-to-upload-download-a-document-in-SharePoint-20.aspx
However, this doesn't work.
Additionally, I would also like to set a title for the file, so I tried updating it like this:
uploadFile.ListItemAllFields.FieldValues["Title"] = "my custom title";
However, once the file has finished uploading, I log in to SharePoint and see that the title hasn't been applied.
How can I combine uploading the file with applying a new name and title?
Many thanks.
EDIT:
using (var clientContext = GetNewContext())
{
var uploadLocation = string.Format("{0}{1}/{2}", SiteUrl, Helpers.ListNames.RequestedDocuments, Path.GetFileName(document));
//Get Document List
var documentslist = clientContext.Web.Lists.GetByTitle(Helpers.ListNames.RequestedDocuments);
var fileCreationInformation = new FileCreationInformation
{
Content = System.IO.File.ReadAllBytes(document), // Assign the document bytes to Content
Overwrite = true, // Allow overwrite of the document
Url = uploadLocation // Upload URL
};
var uploadFile = documentslist.RootFolder.Files.Add(fileCreationInformation);
uploadFile.ListItemAllFields.FieldValues["Title"] = title;
uploadFile.ListItemAllFields.Update();
clientContext.ExecuteQuery();
}
site.SubmitChanges(ConflictMode.FailOnFirstConflict, true);
You are missing a call to clientContext.Load after you add the file to the Files collection. See these blog posts for more information:
https://www.c-sharpcorner.com/code/965/programmatically-upload-document-using-client-object-model-in-sharepoint.aspx
https://zimmergren.net/sp-2010-uploading-files-using-the-client-om-in-sharepoint-2010/
This code sample is adapted from the first blog post linked above (cleaned up so it compiles):
public bool UploadDocument(string fileName, string filePath, List<string> metaDataList)
{
    ClientContext ctx = new ClientContext("http://yoursharepointURL");
    Web web = ctx.Web;
    // Read the local file and prepare the upload
    FileCreationInformation newFile = new FileCreationInformation();
    newFile.Content = System.IO.File.ReadAllBytes(filePath);
    newFile.Url = fileName;
    Microsoft.SharePoint.Client.List docs = web.Lists.GetByTitle("Shared Documents");
    Microsoft.SharePoint.Client.File uploadFile = docs.RootFolder.Files.Add(newFile);
    ctx.Load(uploadFile);
    ctx.ExecuteQuery();
    // Set the metadata on the file's underlying list item
    ListItem item = uploadFile.ListItemAllFields;
    item["Title"] = "My custom title";
    item.Update();
    ctx.ExecuteQuery();
    return true;
}
Are you calling Update after setting the field values?
uploadFile.ListItemAllFields.Update();
The FieldValues dictionary is only a local snapshot of the item's values; writing to it is not tracked by the client context. Instead of setting:
uploadFile.ListItemAllFields.FieldValues["Title"] = title;
uploadFile.ListItemAllFields.Update();
set the value through the item's indexer:
uploadFile.ListItemAllFields["Title"] = title;
uploadFile.ListItemAllFields.Update();
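Putting this together with the code from your EDIT, a minimal sketch might look like the following (assuming GetNewContext, SiteUrl, Helpers.ListNames.RequestedDocuments, document and title are the same members as in your snippet; the custom file name is simply whatever you put into the upload URL):

using (var clientContext = GetNewContext())
{
    // Use any name you like here instead of the source file's name
    var customFileName = "MyCustomName" + Path.GetExtension(document);
    var uploadLocation = string.Format("{0}{1}/{2}", SiteUrl, Helpers.ListNames.RequestedDocuments, customFileName);

    var documentsList = clientContext.Web.Lists.GetByTitle(Helpers.ListNames.RequestedDocuments);

    var fileCreationInformation = new FileCreationInformation
    {
        Content = System.IO.File.ReadAllBytes(document),
        Overwrite = true,
        Url = uploadLocation
    };

    var uploadFile = documentsList.RootFolder.Files.Add(fileCreationInformation);

    // Set the Title through the indexer, not through FieldValues
    uploadFile.ListItemAllFields["Title"] = title;
    uploadFile.ListItemAllFields.Update();

    clientContext.Load(uploadFile);
    clientContext.ExecuteQuery();
}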
I need to download a text file from the File Cabinet in NetSuite. I am able to search for all files in a folder and get back the file size, name and URL. But when I check the 'content' property, it is NULL. How can I download the file locally?
I tried using the URL to download the file with WebClient, but it returns 403, which makes sense.
var result = Client.Service.search(fileSearch);
var recordList = (Record[])result.recordList;
if (recordList != null && recordList.Length != 0)
{
foreach (var item in recordList)
{
var file = (com.netsuite.webservices.File)item;
int fileSize = (int)file.fileSize; // Returns the correct file size
byte[] fileContent = file.content; // NULL reference ??
Console.WriteLine(file.url + " ==== " + file.name );
// How to download the File from the url above??
// Can't do this: 403 error, the client below doesn't use the same security context
//using (var client = new WebClient())
//{
// client.UseDefaultCredentials = false;
// client.DownloadFile(baseUrl + file.url, file.name);
//}
}
}
I expected 'content' to contain the file content.
When you execute a search, the search results do not include the contents of the file, but you DO have the file's internal id. Below is an extension method on the NetSuite service that gets a file by its id:
public static NetSuite.File GetFileById(this NetSuiteService ns, int fileId)
{
var file = new NetSuite.File();
var response = ns.get(new RecordRef()
{
type = RecordType.file,
internalId = fileId.ToString(),
typeSpecified = true
});
if (response.status.isSuccess)
{
file = response.record as NetSuite.File;
}
return file;
}
Usage, to write the file to a local path:
var f = ns.GetFileById(3946);
var path = Path.Combine(Directory.GetCurrentDirectory(), f.name);
var contents = f.content;
System.IO.File.WriteAllBytes(path, contents);
Console.WriteLine($"Downloaded {f.name}");
I have referred to the other examples on this website, but there is a major difference in my method (please be patient).
I am trying to iterate over a directory of files and upload each file as an attachment and associate to a user story.
I am only able to attach 1 file for a user story as of now.
I see that every attachment has to be encoded as a Base64 string, and the attachment record must also carry its size in bytes.
Here is my code so far:
public void createUsWithAttachmentList(string workspace, string project, string userStoryName, string userStoryDescription)
{
//authentication
this.EnsureRallyIsAuthenticated();
//DynamicJSONObject for AttachmentContent
DynamicJsonObject myAttachmentContent = new DynamicJsonObject();
//Length calculated from Base64String converted back
int imageNumberBytes = 0;
//Userstory setup
DynamicJsonObject toCreate = new DynamicJsonObject();
toCreate["Workspace"] = workspace;
toCreate["Project"] = project;
toCreate["Name"] = userStoryName;
toCreate["Description"] = userStoryDescription;
//Get all the file paths within a given directory; this directory contains .png files that need to be attached to the user story.
string[] attachmentPath = Directory.GetFiles("C:\\Users\\user\\Desktop\\RallyAttachments");
This foreach loop is the part that confuses me. I am trying to iterate over each file in the directory to convert it to a Base64 string, and at the same time get the number of bytes of each file as an int.
foreach (string fileName in attachmentPath)
{
Image myImage = Image.FromFile(fileName);
string imageBase64String = imageToBase64(myImage, System.Drawing.Imaging.ImageFormat.Png);
imageNumberBytes = Convert.FromBase64String(imageBase64String).Length;
//I am stuck here to be exact because there are multiple imageBase64Strings due to the collection of files located inside the directory. AND the below line is wrong because I have a list of imageBase64Strings that were generated from iterating through the string[] attachmentPath.
myAttachmentContent[RallyField.content] = imageBase64String;
}
try
{
//create user story
CreateResult createUserStory = _api.Create(RallyField.attachmentContent, myAttachmentContent);
//create attachment
CreateResult myAttachmentContentCreateResult = _api.Create(RallyField.attachmentContent, myAttachmentContent);
String myAttachmentContentRef = myAttachmentContentCreateResult.Reference;
//DynamicJSONObject for Attachment Container
//I assume I would need a separate container for each file in my directory containing the attachments.
DynamicJsonObject myAttachment = new DynamicJsonObject();
myAttachment["Artifact"] = createUserStory.Reference;
myAttachment["Content"] = myAttachmentContentRef;
myAttachment["Name"] = "AttachmentFromREST.png";
myAttachment["Description"] = "Email Attachment";
myAttachment["ContentType"] = "image/png";
myAttachment["Size"] = imageNumberBytes;
//create & associate the attachment
CreateResult myAttachmentCreateResult = _api.Create(RallyField.attachment, myAttachment);
Console.WriteLine("Created User Story: " + createUserStory.Reference);
}
catch (WebException e)
{
Console.WriteLine(e.Message);
}
}
Note: I am planning on extending this method to support multiple file types, and I think I would need to get the file type of each file in the directory and proceed accordingly.
Any ideas on how to go about writing this?
You've got all the parts implemented; we just need to rearrange them a little. Create the story once at the beginning, and then on each pass through the loop create a new AttachmentContent and a new Attachment for that file.
public void createUsWithAttachmentList(string workspace, string project, string userStoryName, string userStoryDescription)
{
//authentication
this.EnsureRallyIsAuthenticated();
//Userstory setup
DynamicJsonObject toCreate = new DynamicJsonObject();
toCreate["Workspace"] = workspace;
toCreate["Project"] = project;
toCreate["Name"] = userStoryName;
toCreate["Description"] = userStoryDescription;
//Create the story first
try
{
//create user story
CreateResult createUserStory = _api.Create(RallyField.userStory, toCreate);
//now loop over each file
string[] attachmentPath = Directory.GetFiles("C:\\Users\\user\\Desktop\\RallyAttachments");
foreach (string fileName in attachmentPath)
{
//DynamicJSONObject for AttachmentContent
DynamicJsonObject myAttachmentContent = new DynamicJsonObject();
Image myImage = Image.FromFile(fileName);
string imageBase64String = imageToBase64(myImage, System.Drawing.Imaging.ImageFormat.Png);
int imageNumberBytes = Convert.FromBase64String(imageBase64String).Length;
myAttachmentContent[RallyField.content] = imageBase64String;
//create the AttachmentContent
CreateResult myAttachmentContentCreateResult = _api.Create(RallyField.attachmentContent, myAttachmentContent);
String myAttachmentContentRef = myAttachmentContentCreateResult.Reference;
//create an Attachment to associate to story
DynamicJsonObject myAttachment = new DynamicJsonObject();
myAttachment["Artifact"] = createUserStory.Reference;
myAttachment["Content"] = myAttachmentContentRef;
myAttachment["Name"] = Path.GetFileName(fileName); //use the actual file name rather than a hard-coded one
myAttachment["Description"] = "Email Attachment";
myAttachment["ContentType"] = "image/png";
myAttachment["Size"] = imageNumberBytes;
//create & associate the attachment
CreateResult myAttachmentCreateResult = _api.Create(RallyField.attachment, myAttachment);
}
}
catch (WebException e)
{
Console.WriteLine(e.Message);
}
}
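Both snippets call an imageToBase64 helper that isn't shown in the post. A minimal sketch of such a helper (my assumption of its shape, not part of the original code) could be:

// Hypothetical helper: converts an Image to a Base64 string in the given format
public string imageToBase64(Image image, System.Drawing.Imaging.ImageFormat format)
{
    using (var ms = new System.IO.MemoryStream())
    {
        // Write the image into the stream, then encode the raw bytes as Base64
        image.Save(ms, format);
        return Convert.ToBase64String(ms.ToArray());
    }
}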
I am trying to upload multiple files to a document library and also update their column values.
The list (document library) already exists, but I am stuck on uploading the files.
I've tried these methods:
Using lists.asmx:
NetworkCredential credentials = new NetworkCredential("user", "Pass", "domain");
#region ListWebService
ListService.Lists listService = new ListService.Lists();
listService.Credentials = credentials;
List list = cc.Web.Lists.GetByTitle(library);
listService.Url = cc.Url + "/_vti_bin/lists.asmx";
try
{
FileStream fStream = System.IO.File.OpenRead(filePath);
string fName = fStream.Name.Substring(3);
byte[] contents = new byte[fStream.Length];
fStream.Read(contents, 0, (int)fStream.Length);
fStream.Close();
string attach = listService.AddAttachment(library, itemId.ToString(), Path.GetFileName(filePath), contents);
}
#endregion
catch (System.Web.Services.Protocols.SoapException ex)
{
CSVWriter("Message:\n" + ex.Message + "\nDetail:\n" +
ex.Detail.InnerText + "\nStackTrace:\n" + ex.StackTrace, LogReport);
}
It gives an error on AddAttachment(): ServerException: "To add an item to a document library, use SPFileCollection.Add()".
Using the client object model (CSOM):
List lib = cc.Web.Lists.GetByTitle("TestLib");
FileCreationInformation fileInfo = new FileCreationInformation();
fileInfo.Content = System.IO.File.ReadAllBytes("C:\\Users\\AJohn\\Desktop\\sample.docx");
fileInfo.Url = "https://serverm/sites/Testing1/TestLib/sample.docx";
fileInfo.Overwrite = true;
Microsoft.SharePoint.Client.File upFile = lib.RootFolder.Files.Add(fileInfo);
cc.Load(upFile);
cc.ExecuteQuery();
I was able to upload once using this method, but now I am getting the same ServerException ("To add an item to a document library, use SPFileCollection.Add()") on cc.ExecuteQuery().
If this method can be made to work, what I want is to update the column values related to the uploaded file. With the first method I get item.ID, so from there I can update the column values.
Regarding the second method, the following example demonstrates how to upload a file into the Documents library and set its properties (e.g. a Category text field):
using (var ctx = new ClientContext(webUri))
{
var targetList = ctx.Web.Lists.GetByTitle("Documents");
var fileInfo = new FileCreationInformation
{
Url = System.IO.Path.GetFileName(sourcePath),
Content = System.IO.File.ReadAllBytes(sourcePath),
Overwrite = true
};
var file = targetList.RootFolder.Files.Add(fileInfo);
var item = file.ListItemAllFields;
item["Category"] = "User Guide";
item.Update();
ctx.ExecuteQuery();
}
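If you also need the item's ID after the upload (as you do with your first method), you can ask the client context to load it explicitly. A small sketch, continuing from the example above:

// Request the Id property so it is populated after ExecuteQuery
ctx.Load(item, i => i.Id);
ctx.ExecuteQuery();
Console.WriteLine("Uploaded item ID: " + item.Id);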
I'm uploading file to Sharepoint server using this code:
ClientOM.File uploadFile = null;
try {
string fileRef = serverRelativeURL + msg.Message.FileName;
FileCreationInformation fileCreationInformation = new FileCreationInformation() {
Content = msg.Content,
Url = fileRef,
Overwrite = true
};
uploadFile = _currentList.RootFolder.Files.Add(fileCreationInformation);
_currentContext.ExecuteQuery();
The file uploads fine. But on the server we have an event receiver that adds a random string to the file name, so fileRef is no longer valid after the upload.
We also need to set the Author of the file. For that we have to retrieve the file and update the property, which I do with this code:
string fileName = serverRelativeURL + msg.Message.FileName;
uploadFile = _currentContext.Web.GetFileByServerRelativeUrl(fileName);
_currentContext.Load(uploadFile);
uploadFile.ListItemAllFields["Author"] = _currentUser;
uploadFile.ListItemAllFields["Editor"] = _currentUser;
uploadFile.ListItemAllFields.Update();
_currentContext.ExecuteQuery();
On ExecuteQuery() I get a "File not found" exception. But if I copy the path from SharePoint (the one with the random string), everything works OK.
So the question is: is there another way to retrieve the file, for example by ID? When we upload the file, the "uploadFile" instance does not contain much useful information.
Method 1:
Keep track of the filename, and then use this code to retrieve it directly.
public FileInformation GetFileFromAttachment(int itemId, string filename)
{
FileInformation file = null;
//continue here
if (new FileInfo(filename).Name != null)
{
CSOMUtils.ExecuteInNewContext(QueryConfig.siteUrl, delegate(ClientContext clientContext)
{
clientContext.Credentials = QueryConfig.credentials;
clientContext.Load(clientContext.Web, l => l.ServerRelativeUrl);
clientContext.ExecuteQuery();
List oList = clientContext.Web.Lists.GetByTitle(ListName);
clientContext.ExecuteQuery();
string url = string.Format("{0}/Lists/{1}/Attachments/{2}/{3}",
clientContext.Web.ServerRelativeUrl,
ListName,
itemId,
filename);
var f = clientContext.Web.GetFileByServerRelativeUrl(url);
clientContext.Load(f);
clientContext.ExecuteQuery();
file = File.OpenBinaryDirect(clientContext, f.ServerRelativeUrl);
});
}
return file;
}
Method 2:
You can use ServerRelativeUrl to get the folder containing all the attachments.
https://msdn.microsoft.com/library/office/microsoft.sharepoint.client.folder.serverrelativeurl.aspx
https://sharepoint.stackexchange.com/questions/132008/reliably-get-attachments-for-list-item
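A rough sketch of Method 2, reusing the clientContext (with Web.ServerRelativeUrl already loaded), ListName and itemId from Method 1 above; the attachment folder follows the same /Lists/{list}/Attachments/{itemId} convention that Method 1 builds:

// Get the attachments folder for the list item and enumerate its files
var folderUrl = string.Format("{0}/Lists/{1}/Attachments/{2}",
    clientContext.Web.ServerRelativeUrl, ListName, itemId);
var folder = clientContext.Web.GetFolderByServerRelativeUrl(folderUrl);
clientContext.Load(folder.Files, files => files.Include(f => f.Name, f => f.ServerRelativeUrl));
clientContext.ExecuteQuery();

foreach (var f in folder.Files)
{
    Console.WriteLine(f.Name + " -> " + f.ServerRelativeUrl);
}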
We are having someone manually load weekly generated Excel spreadsheets into SharePoint. I'm sure there is a way to automate this. I don't know a lot about SharePoint; maybe it's really as simple as knowing the folder SharePoint moves the files to and copying them directly there, or maybe it requires some programming to automatically load new files from a given directory into SharePoint.
Either way, I would just like someone to point me in the right direction here.
You will need to upload the file using the copy web service in SharePoint. I am not sure what version of SharePoint you are running but I am assuming 2007. Here is a sample project.
public void UploadFile(string destinationFolderPath,
byte[] fileBytes,
string fileName,
bool overwrite,
string sourceFileUrl,
string lastVersionUrl)
{
List<Sharepoint.FieldInformation> fields = new List<Sharepoint.FieldInformation>();
Sharepoint.FieldInformation fieldInfo;
fieldInfo = new Sharepoint.FieldInformation();
fieldInfo.Id = Microsoft.SharePoint.SPBuiltInFieldId.Title;
fieldInfo.Value = "New title";
fieldInfo.DisplayName = "Title";
fieldInfo.Type = YetAnotherMigrationTool.Library.SP2007.Sharepoint.FieldType.Text;
fieldInfo.InternalName = "Title";
fields.Add(fieldInfo);
string[] url;
if (string.IsNullOrEmpty(destinationFolderPath))
url = new string[] { string.Format("{0}/{1}/{2}", _siteUrl, _name, fileName) };
else
url = new string[] { string.Format("{0}/{1}/{2}{3}", _siteUrl, _name, destinationFolderPath, fileName) };
Sharepoint.CopyResult[] result;
Sharepoint.Copy service = new Sharepoint.Copy();
service.Url = _siteUrl + "/_vti_bin/Copy.asmx";
service.Credentials = new NetworkCredential(Settings.Instance.User, Settings.Instance.Password);
service.Timeout = 600000;
uint documentId = service.CopyIntoItems(sourceFileUrl, url, fields.ToArray(), fileBytes, out result);
}
public void SetContentType(List<string> ids, string contentType)
{
ListsService.Lists service = new YetAnotherMigrationTool.Library.SP2007.ListsService.Lists();
service.Url = _siteUrl + "/_vti_bin/Lists.asmx";
service.Credentials = new NetworkCredential(Settings.Instance.User, Settings.Instance.Password);
string strBatch = "";
for (int i = 1; i <= ids.Count; i++)
{
strBatch += @"<Method ID='" + i.ToString() + @"' Cmd='Update'><Field Name='ID'>" + ids[i - 1] + "</Field><Field Name='ContentType'>" + contentType + "</Field></Method>";
}
XmlDocument xmlDoc = new XmlDocument();
XmlElement elBatch = xmlDoc.CreateElement("Batch");
elBatch.SetAttribute("OnError", "Continue");
elBatch.SetAttribute("ListVersion", "10");
elBatch.SetAttribute("ViewName", "");
elBatch.InnerXml = strBatch;
XmlNode result = service.UpdateListItems(_name, elBatch);
}
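A call to the first method might then look like this (a sketch only; the local path and file name are placeholders, and _siteUrl, _name and the Copy web reference are assumed to be configured as in the sample above):

// Read the weekly spreadsheet and push it into the library root
byte[] bytes = System.IO.File.ReadAllBytes(@"C:\Reports\WeeklyReport.xlsx");
UploadFile(
    destinationFolderPath: "",            // empty = library root
    fileBytes: bytes,
    fileName: "WeeklyReport.xlsx",
    overwrite: true,
    sourceFileUrl: @"C:\Reports\WeeklyReport.xlsx",
    lastVersionUrl: "");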
You could write a PowerShell script that copies the document into the document library via WebDav:
Assuming you have your document library at http://server/SomeWeb/DocumentLibrary/Folder:
copy-item somesheet.xlsx \\server\SomeWeb\DocumentLibrary\Folder
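The same WebDav copy can also be done from C# if you prefer to keep everything in one language (a sketch; the server, web, library and file names are placeholders):

// Copy the spreadsheet to the document library's UNC/WebDav path
var source = @"C:\Reports\somesheet.xlsx";
var destination = @"\\server\SomeWeb\DocumentLibrary\Folder\somesheet.xlsx";
System.IO.File.Copy(source, destination, overwrite: true);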