I want to display the value of a SharePoint People/Group field in a PeopleEditor control (web part) when the page is loaded. This is the code I use to get the other values displayed in the web part:
if (SPContext.Current.ListItem.ID >= 1)
{
    using (SPSite site = new SPSite("sitename"))
    {
        using (SPWeb web = site.OpenWeb())
        {
            var id = SPContext.Current.ListItem.ID;
            SPList lists = web.Lists["DDClist"];
            SPListItem item = lists.GetItemById(id);

            tb_pno.Text = Convert.ToString(item["Project No"]);
            tb_pname.Text = Convert.ToString(item["Project Title"]);
            tb_idcno.Text = Convert.ToString(item["DDC No"]);
            TextBox3.Text = Convert.ToString(item["Date In"]);
        }
    }
}
Is there a way to do the same thing with a PeopleEditor?
This is all a little tricky; when I've had to do it before, I use the following to get SPUser object out of a field:
SPUser singleUser = new SPFieldUserValue(
    item.Web, item["Single User"] as string).User;

SPUser[] multipleUsers = ((SPFieldUserValueCollection)item["MultipleUsers"])
    .Cast<SPFieldUserValue>()
    .Select(f => f.User)
    .ToArray();
I'm not sure why one user is stored as a string, but multiple users are stored as a specific object; it may also not be consistent in this so you might have to debug a bit and see what the type in your field is.
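If you don't want to guess, a runtime type check covers both shapes. A minimal sketch, assuming "AssignedTo" is the internal name of your user field (substitute your own):
// Minimal sketch: handle both single-value and multi-value user fields.
// "AssignedTo" is a placeholder internal field name.
object raw = item["AssignedTo"];
List<SPUser> users = new List<SPUser>();

if (raw is SPFieldUserValueCollection)
{
    // Multi-value user field: a collection of SPFieldUserValue objects
    foreach (SPFieldUserValue value in (SPFieldUserValueCollection)raw)
    {
        users.Add(value.User);
    }
}
else if (raw is string && !String.IsNullOrEmpty((string)raw))
{
    // Single-value user field: an "ID;#Name" lookup string
    users.Add(new SPFieldUserValue(item.Web, (string)raw).User);
}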
Once you have these SPUsers, you can populate your PeopleEditor control
using the account names as follows (quite long-winded):
ArrayList entityArrayList = new ArrayList();
foreach (SPUser user in multipleUsers) // or just once for a single user
{
    PickerEntity entity = new PickerEntity();
    entity.Key = user.LoginName;
    entity = peMyPeople.ValidateEntity(entity);
    entityArrayList.Add(entity);
}
peMyPeople.UpdateEntities(entityArrayList);
The ValidateEntity call also performs some resolution and validation of each user.
If the page this control appears on may be posted back, the following needs to run during the postback for the values to round-trip correctly; I put it in PreRender, but it could happen elsewhere in the lifecycle:
protected override void OnPreRender(EventArgs e)
{
    if (IsPostBack)
    {
        // Reading CommaSeparatedAccounts makes the control pick up the
        // posted-back values so they round-trip correctly.
        var csa = peMyPeople.CommaSeparatedAccounts;
        csa = peMyPeople.CommaSeparatedAccounts;
    }
    base.OnPreRender(e);
}
If you want to check any error messages that the control generates for you (if the user input is incorrect), you need to have done the CommaSeparatedAccounts read shown above first:
var csa = usrBankSponsor.CommaSeparatedAccounts;
csa = usrOtherBankParties.CommaSeparatedAccounts;
//ErrorMessage is incorrect if you haven't done the above
if (!String.IsNullOrEmpty(usrBankSponsor.ErrorMessage))
{
...
}
It's really not very nice and there may be a much better way of handling it, but this is the result of my experience so far so hopefully it will save you some time.
I'm new to SharePoint. I'm trying to programmatically create a wiki page within the Pages library of an Enterprise Wiki site in SharePoint 2010. Here is my code:
using (SPSite site = new SPSite(SPContext.Current.Web.Url))
{
    SPWeb rootWeb = site.RootWeb;
    rootWeb.AllowUnsafeUpdates = true;

    SPList wiki = rootWeb.Lists["Pages"];
    SPFolder rootFolder = wiki.RootFolder;
    SPFile wikiPage = rootFolder.Files.Add(
        String.Format("{0}/{1}", rootFolder.ServerRelativeUrl, "MyWikiPage.aspx"),
        SPTemplateFileType.WikiPage);

    SPListItem wikiItem = wikiPage.Item;
    wikiItem["PublishingPageContent"] = "my demo content";
    wikiItem.UpdateOverwriteVersion();

    rootWeb.AllowUnsafeUpdates = false;
}
The page gets created, but it is not editable and the demo content is not inserted. When opened in edit mode, no content area is available and the edit options are greyed out.
I have also tried setting the default content like this:
wikiItem[SPBuiltInFieldId.WikiField] = "my demo content";
But that gives an invalid field error.
I have also tried creating the page with this line of code instead:
SPFile wikiPage = SPUtility.CreateNewWikiPage(wiki, String.Format("{0}/{1}", rootFolder.ServerRelativeUrl, "MyWikiPage.aspx"));
But the result is exactly the same.
I have confirmed that "SharePoint Server Publishing" feature is turned on for the site and "SharePoint Server Publishing Infrastructure" feature is turned on for the site collection.
Please help.
With help from my other thread on sharepoint.stackexchange.com, I came up with this solution:
Instead of targeting the Pages library with the regular wiki manipulation routines, we need to create a new publishing page and update its content type properties accordingly.
For the benefit of others, here's the code that worked for me:
using (SPSite site = new SPSite(SPContext.Current.Web.Url))
{
    SPWeb rootWeb = site.RootWeb;
    rootWeb.AllowUnsafeUpdates = true;

    SPList wiki = rootWeb.Lists["Pages"];
    String url = wiki.RootFolder.ServerRelativeUrl.ToString();

    PublishingSite pubSite = new PublishingSite(rootWeb.Site);
    string pageLayoutName = "EnterpriseWiki.aspx"; // page layout name
    string layoutURL = rootWeb.Url + "/_catalogs/masterpage/" + pageLayoutName;
    PageLayout layout = pubSite.PageLayouts[layoutURL];

    PublishingWeb publishingWeb = PublishingWeb.GetPublishingWeb(rootWeb);
    string myWikiPage = "MyWikiPage.aspx"; // page name
    PublishingPage newWikiPage = publishingWeb.GetPublishingPages().Add(myWikiPage, layout);
    newWikiPage.Title = "My Wiki Page";
    newWikiPage.Update();

    rootWeb.AllowUnsafeUpdates = false;
}
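If you also want the new page to carry content and be visible to readers, you can extend the block above (before AllowUnsafeUpdates is reset) along these lines. This is only a sketch; it assumes the same EnterpriseWiki layout and that you adjust the check-in/publish/approve steps to your library's versioning and moderation settings:
// Sketch: push content into the new page, then check it in; publish/approve
// only if your Pages library's settings require it.
SPListItem newItem = newWikiPage.ListItem;
newItem["PublishingPageContent"] = "my demo content";
newItem.Update();

newWikiPage.CheckIn("Created programmatically");
if (wiki.EnableMinorVersions)
{
    newItem.File.Publish("Created programmatically");
}
if (wiki.EnableModeration)
{
    newItem.File.Approve("Created programmatically");
}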
How to create wiki page using SSOM
C#
/// <summary>
/// Creates a wiki page in an Enterprise Wiki Pages library.
/// </summary>
/// <param name="wikiPages">The Pages library of the Enterprise Wiki site.</param>
/// <param name="pageName">File name for the new page, e.g. "MyWikiPage.aspx".</param>
/// <param name="pageContent">HTML content for the PublishingPageContent field.</param>
/// <returns>The list item of the newly created page.</returns>
public static SPListItem CreateWikiPage(SPList wikiPages, string pageName, string pageContent)
{
    var web = wikiPages.ParentWeb;
    var pSite = new Microsoft.SharePoint.Publishing.PublishingSite(web.Site);
    var pageLayoutUrl = SPUtility.ConcatUrls(web.Site.Url, "/_catalogs/masterpage/EnterpriseWiki.aspx");
    var pageLayout = pSite.PageLayouts[pageLayoutUrl];
    var pWeb = Microsoft.SharePoint.Publishing.PublishingWeb.GetPublishingWeb(web);
    var wikiPage = pWeb.GetPublishingPages().Add(pageName, pageLayout);
    var wikiItem = wikiPage.ListItem;
    wikiItem["PublishingPageContent"] = pageContent;
    wikiItem.Update();
    return wikiItem;
}
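For example, calling it from server-side code inside an SPSite/SPWeb scope might look like this (a sketch; the web URL, list name and content are placeholders matching the PowerShell usage below):
// Hypothetical usage sketch; adjust the URL, list name and content to your site.
using (SPSite site = new SPSite("http://contoso.intranet.sp.dev/faq/"))
using (SPWeb web = site.OpenWeb())
{
    SPList pages = web.Lists["Pages"];
    SPListItem faqPage = CreateWikiPage(pages, "FAQ.aspx", "Welcome to FAQ");
}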
PowerShell
if ((Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null)
{
    Add-PSSnapin Microsoft.SharePoint.PowerShell
}

Function Create-WikiPage([string]$WebUrl, [string]$PageName, [string]$PageContent)
{
    $web = Get-SPWeb $WebUrl
    $wikiPages = $web.Lists["Pages"]
    $pSite = New-Object Microsoft.SharePoint.Publishing.PublishingSite($web.Site)
    $pageLayoutUrl = $web.Site.Url + "/_catalogs/masterpage/EnterpriseWiki.aspx"
    $pageLayout = $pSite.PageLayouts[$pageLayoutUrl]
    $pWeb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)
    $wikiPage = $pWeb.GetPublishingPages().Add($PageName, $pageLayout)
    $wikiPage.Title = [System.IO.Path]::GetFileNameWithoutExtension($PageName)
    $wikiItem = $wikiPage.ListItem
    $wikiItem["PublishingPageContent"] = $PageContent
    $wikiItem.Update()
}
Usage
Create-WikiPage -WebUrl "http://contoso.intranet.sp.dev/faq/" -PageName "FAQ.aspx" -PageContent "Welcome to FAQ"
This question is old, but I scoured the internet for how to do this with the Graph API and came up with nothing. I wanted to post my method somewhere so the next poor soul won't have to figure it out on their own.
How to create SharePoint Wiki Pages with the Graph API
The problem here is that wiki pages are .aspx files in a site's drive and are also ListItems in the "Pages" list. If you just try to create a List Item:
using Azure.Identity;
using Microsoft.Graph;
using System.Security;                              // SecurityElement.Escape, used further down
using System.Security.Cryptography.X509Certificates;
using System.Text;                                  // Encoding, used further down

const string TENANT_ID = "<YourTenantId>";
const string APP_ID = "<YourAppId>";
const string CERTIFICATE_THUMBPRINT = "<YourCertificateThumbprint>";
const string SITE_URL = "<YourSiteUrl>";

//Get certificate
X509Store certStore = new(StoreName.My, StoreLocation.CurrentUser, OpenFlags.ReadOnly);
X509Certificate2 cert = certStore.Certificates.First(c => c.Thumbprint == CERTIFICATE_THUMBPRINT);

//Get a client using your certificate
ClientCertificateCredential certCredential = new(TENANT_ID, APP_ID, cert);
GraphServiceClient graphClient = new(certCredential);
//Get the site by the url and get the "Pages" list in that site
Uri docSiteUrl = new(SITE_URL);
Site site = await graphClient.Sites[$"{docSiteUrl.Host}:{docSiteUrl.AbsolutePath}"].Request().GetAsync();
var pagesLists = await graphClient.Sites[site.Id].Lists.Request().Filter($"(displayName eq 'Pages')").Top(2).GetAsync();
List pagesList = pagesLists.Single();
//Try to add a new list item
ListItem itemToAdd = new()
{
    Fields = new()
    {
        AdditionalData = new Dictionary<string, object>()
        {
            { "PublishingPageContent", "<p>Hello, World!</p>" },
            { "PublishingPageLayout", "/_catalogs/masterpage/EnterpriseWiki.aspx, Basic Page" }
        }
    }
};
var itemCreated = await graphClient.Sites[site.Id].Lists[pagesList.Id].Items.Request().AddAsync(itemToAdd);
Then you'll get the following error:
Microsoft.Graph.ServiceException: Files and folders should only be added to a DocumentLibrary via the OneDrive API
OK, so let's go ahead and create the item in the drive you'd say:
//Create a drive item
var drives = await graphClient.Sites[site.Id].Drives.Request().GetAsync();
var pagesDrive = drives.Where(drv => drv.Name == "Pages").Single();
using var stream = new MemoryStream(Encoding.UTF8.GetBytes("<p>Hello, World!</p>"));
var driveItem = await graphClient.Sites[site.Id].Drives[pagesDrive.Id].Root.ItemWithPath("hello.aspx").Content.Request().PutAsync<DriveItem>(stream);
And hey presto, it worked! Or, well, it ran. Then you go to SharePoint and try to open it and you realize it's just a page. It's not actually a Wiki Page. No problem, let's just go ahead and change that page layout:
//Update the list item
var listItem = await graphClient.Sites[site.Id].Drives[pagesDrive.Id].Items[driveItem.Id].ListItem.Request().GetAsync();
FieldValueSet fieldstoUpdate = new()
{
    AdditionalData = new Dictionary<string, object>()
    {
        { "PublishingPageContent", "<p>Hello, World!</p>" },
        { "PublishingPageLayout", "/_catalogs/masterpage/EnterpriseWiki.aspx, Basic Page" }
    }
};
var updatedFields = await graphClient.Sites[site.Id].Lists[pagesList.Id].Items[listItem.Id].Fields.Request().UpdateAsync(fieldstoUpdate);
But unfortunately we get:
Microsoft.Graph.ServiceException: Field 'PublishingPageLayout' is read-only
So are we up a proverbial creek? We can't create the list item directly, we can't edit the content type after it exists, and we can only upload the file name and contents to the drive endpoint.
The only way we could set the publishing properties is if somehow SharePoint were putting the publishing engine's metadata inside the document contents itself. Let's just go ahead and manually create an empty wiki page, and download the file to see if there's any -OH MY GOD! LOOK AT ALL THAT METADATA!
<%@ Page Inherits="Microsoft.SharePoint.Publishing.TemplateRedirectionPage,Microsoft.SharePoint.Publishing,Version=16.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c" %> <%@ Reference VirtualPath="~TemplatePageUrl" %> <%@ Reference VirtualPath="~masterurl/custom.master" %><%@ Register Tagprefix="SharePoint" Namespace="Microsoft.SharePoint.WebControls" Assembly="Microsoft.SharePoint, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
<html xmlns:mso="urn:schemas-microsoft-com:office:office" xmlns:msdt="uuid:C2F41010-65B3-11d1-A29F-00AA00C14882"><head>
<!--[if gte mso 9]><SharePoint:CTFieldRefs runat=server Prefix="mso:" FieldList="FileLeafRef,Comments,PublishingStartDate,PublishingExpirationDate,PublishingContactEmail,PublishingContactName,PublishingContactPicture,PublishingPageLayout,PublishingVariationGroupID,PublishingVariationRelationshipLinkFieldID,PublishingRollupImage,Audience,PublishingIsFurlPage,SeoBrowserTitle,SeoMetaDescription,SeoKeywords,RobotsNoIndex,PublishingPageContent,AverageRating,RatingCount,Ratings,e1a5b98cdd71426dacb6e478c7a5882f,TaxCatchAllLabel,LikesCount,Suggested_x0020_Reading"><xml>
<mso:CustomDocumentProperties>
<mso:PublishingContact msdt:dt="string">19</mso:PublishingContact>
<mso:PublishingIsFurlPage msdt:dt="string">0</mso:PublishingIsFurlPage>
<mso:display_urn_x003a_schemas-microsoft-com_x003a_office_x003a_office_x0023_PublishingContact msdt:dt="string">Kubat, Luke</mso:display_urn_x003a_schemas-microsoft-com_x003a_office_x003a_office_x0023_PublishingContact>
<mso:PublishingContactPicture msdt:dt="string"></mso:PublishingContactPicture>
<mso:RobotsNoIndex msdt:dt="string">0</mso:RobotsNoIndex>
<mso:PublishingContactName msdt:dt="string"></mso:PublishingContactName>
<mso:PublishingPageLayoutName msdt:dt="string">EnterpriseWiki.aspx</mso:PublishingPageLayoutName>
<mso:Comments msdt:dt="string"></mso:Comments>
<mso:PublishingContactEmail msdt:dt="string"></mso:PublishingContactEmail>
<mso:PublishingPageLayout msdt:dt="string">YOUR_SITE_HERE/_catalogs/masterpage/EnterpriseWiki.aspx, Basic Page</mso:PublishingPageLayout>
<mso:TaskStatus msdt:dt="string">Not Started</mso:TaskStatus>
<mso:PublishingPageContent msdt:dt="string"><p>!</p></mso:PublishingPageContent>
<mso:RequiresRouting msdt:dt="string">False</mso:RequiresRouting>
</mso:CustomDocumentProperties>
</xml></SharePoint:CTFieldRefs><![endif]-->
<title>Hello World</title></head>
Now, I won't pretend to be able to tell you what all of these fields do, and I'm guessing you can't use mine. I've got some fields on there I'm sure you don't have, and vice versa. But if you manually create a simple wiki page, download it as a template, replace the mso:PublishingPageContent contents with the XML-encoded page contents you want, and then upload it like I did before, you'll get a valid Wiki Page:
//Create a drive item
string pageContents = @"<%@ Page Inherits=""Microsoft.SharePoint.Publishing.TemplateRedirectionPage,Microsoft.SharePoint.Publishing,Version=16.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c"" %> <%@ Reference VirtualPath=""~TemplatePageUrl"" %> <%@ Reference VirtualPath=""~masterurl/custom.master"" %><%@ Register Tagprefix=""SharePoint"" Namespace=""Microsoft.SharePoint.WebControls"" Assembly=""Microsoft.SharePoint, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"" %>
<html xmlns:mso=""urn:schemas-microsoft-com:office:office"" xmlns:msdt=""uuid:C2F41010-65B3-11d1-A29F-00AA00C14882""><head>
<!--[if gte mso 9]><SharePoint:CTFieldRefs runat=server Prefix=""mso:"" FieldList=""FileLeafRef,Comments,PublishingStartDate,PublishingExpirationDate,PublishingContactEmail,PublishingContactName,PublishingContactPicture,PublishingPageLayout,PublishingVariationGroupID,PublishingVariationRelationshipLinkFieldID,PublishingRollupImage,Audience,PublishingIsFurlPage,SeoBrowserTitle,SeoMetaDescription,SeoKeywords,RobotsNoIndex,PublishingPageContent,AverageRating,RatingCount,Ratings,e1a5b98cdd71426dacb6e478c7a5882f,TaxCatchAllLabel,LikesCount,Suggested_x0020_Reading""><xml>
<mso:CustomDocumentProperties>
<mso:PublishingContact msdt:dt=""string"">19</mso:PublishingContact>
<mso:PublishingIsFurlPage msdt:dt=""string"">0</mso:PublishingIsFurlPage>
<mso:display_urn_x003a_schemas-microsoft-com_x003a_office_x003a_office_x0023_PublishingContact msdt:dt=""string"">Kubat, Luke</mso:display_urn_x003a_schemas-microsoft-com_x003a_office_x003a_office_x0023_PublishingContact>
<mso:PublishingContactPicture msdt:dt=""string""></mso:PublishingContactPicture>
<mso:RobotsNoIndex msdt:dt=""string"">0</mso:RobotsNoIndex>
<mso:PublishingContactName msdt:dt=""string""></mso:PublishingContactName>
<mso:PublishingPageLayoutName msdt:dt=""string"">EnterpriseWiki.aspx</mso:PublishingPageLayoutName>
<mso:Comments msdt:dt=""string""></mso:Comments>
<mso:PublishingContactEmail msdt:dt=""string""></mso:PublishingContactEmail>
<mso:PublishingPageLayout msdt:dt=""string"">YOUR_SITE_HERE/_catalogs/masterpage/EnterpriseWiki.aspx, Basic Page</mso:PublishingPageLayout>
<mso:TaskStatus msdt:dt=""string"">Not Started</mso:TaskStatus>
<mso:PublishingPageContent msdt:dt=""string"">" + SecurityElement.Escape("<p>Hello, World!</p>") + @"</mso:PublishingPageContent>
<mso:RequiresRouting msdt:dt=""string"">False</mso:RequiresRouting>
</mso:CustomDocumentProperties>
</xml></SharePoint:CTFieldRefs><![endif]-->
<title>Hello World</title></head>";
var drives = await graphClient.Sites[site.Id].Drives.Request().GetAsync();
var pagesDrive = drives.Where(drv => drv.Name == "Pages").Single();
using var stream = new MemoryStream(Encoding.UTF8.GetBytes(pageContents));
var driveItem = await graphClient.Sites[site.Id].Drives[pagesDrive.Id].Root.ItemWithPath("hello.aspx").Content.Request().PutAsync<DriveItem>(stream);
Again, you can't use my template verbatim; you'll need to download your own. I'm using SecurityElement.Escape() to XML-encode the wiki page contents I want. But if you upload that file, it should work.
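If you'd rather not embed the whole template in a string literal, one option is to keep your downloaded .aspx on disk with a placeholder in it and substitute the content at runtime. A sketch, where "wiki-template.aspx" and "{{CONTENT}}" are made-up names you would create yourself from your own downloaded page:
// Hypothetical helper: loads a saved wiki page template and swaps in the
// XML-encoded content. Requires using System.Security; for SecurityElement.
static string BuildWikiPageContents(string templatePath, string htmlContent)
{
    string template = File.ReadAllText(templatePath);
    return template.Replace("{{CONTENT}}", SecurityElement.Escape(htmlContent));
}

string pageContents = BuildWikiPageContents("wiki-template.aspx", "<p>Hello, World!</p>");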
What's even cooler is that once you have uploaded a valid file, you can change the contents programmatically on the list item, since it is only the template that's read-only:
//Update the list item
var listItem = await graphClient.Sites[site.Id].Drives[pagesDrive.Id].Items[driveItem.Id].ListItem.Request().GetAsync();
FieldValueSet fieldstoUpdate = new()
{
    AdditionalData = new Dictionary<string, object>()
    {
        { "PublishingPageContent", "<p>Hello, World!</p>" }
    }
};
var updatedFields = await graphClient.Sites[site.Id].Lists[pagesList.Id].Items[listItem.Id].Fields.Request().UpdateAsync(fieldstoUpdate);
That's just a bit easier than having to re-upload the whole template every time you want to edit.
This is the code to get the links:
private List<string> getLinks(HtmlAgilityPack.HtmlDocument document)
{
    List<string> mainLinks = new List<string>();
    var linkNodes = document.DocumentNode.SelectNodes("//a[@href]");
    if (linkNodes != null)
    {
        foreach (HtmlNode link in linkNodes)
        {
            var href = link.Attributes["href"].Value;
            mainLinks.Add(href);
        }
    }
    return mainLinks;
}
Sometimes the links I'm getting start with "/", for example:
"/videos?feature=mh"
or
"//www.youtube.com/my_videos_upload"
I'm not sure whether a bare "/" counts as a proper site address, or a link that starts with "/videos?...",
or one that starts with "//www.youtube...".
Each time I need to get the links from a website that start with http or https; maybe ones that start with just www also count as proper sites. The question is: what do I define as a proper site address and link, and what not?
I'm also sure my getLinks function isn't written the way it should be.
This is how I'm adding the links to the List:
private List<string> test(string url, int levels, DoWorkEventArgs eve)
{
    HtmlAgilityPack.HtmlDocument doc;
    HtmlWeb hw = new HtmlWeb();
    List<string> webSites; // = new List<string>();
    List<string> csFiles = new List<string>();
    try
    {
        doc = hw.Load(url);
        webSites = getLinks(doc);
webSites is a List. After a few iterations I see entries in the List like "/" or, as above, "//videos..." or "//www....".
I'm not sure I understood your question, but /Videos means it is accessing the Videos folder from the root of the host you are accessing, e.g.:
www.somesite.com/Videos
There are absolute and relative URLs, so you are getting different flavors from different links; you need to make them absolute appropriately (the Uri class will mostly handle it for you; see the sketch after this list).
foo/bar.txt - relative URL from the same path as the current page
../foo/bar.txt - relative path from one folder above the current one
/foo/bar.txt - server-relative path from the root - same server, path starting from the root
//www.sample.com/foo/bar.txt - absolute URL with the same scheme (http/https) as the current page
http://www.sample.com/foo/bar.txt - complete absolute URL
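For example, a minimal sketch of resolving whatever comes out of href against the page it was found on (the base address below is only an illustration):
// Minimal sketch: resolve relative and protocol-relative hrefs against the
// URL of the page they came from. The base address here is just an example.
Uri baseUri = new Uri("https://www.youtube.com/");
string[] hrefs = { "/videos?feature=mh", "//www.youtube.com/my_videos_upload", "foo/bar.txt", "http://www.sample.com/foo/bar.txt" };

foreach (string href in hrefs)
{
    // new Uri(base, relative) handles relative, server-relative and
    // protocol-relative forms; fully absolute URLs pass through unchanged.
    Uri absolute = new Uri(baseUri, href);
    Console.WriteLine(absolute.AbsoluteUri);
}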
It looks like you are using a library that is able to parse/read html tags.
To my understanding,
var href = link.Attributes["href"].Value;
is doing nothing but reading the value of the "href" attribute.
So assuming the website's source code is using links like href="/news"
it will grab and save even the relative links to your list.
Just view the target website's source code and check it against your results.
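If you only want entries that are already full http/https addresses, a simple filter over the collected list might look like this (a sketch, separate from the original code):
// Sketch: keep only hrefs that are already absolute http/https URLs;
// relative, server-relative and protocol-relative entries are dropped.
private List<string> FilterAbsoluteLinks(List<string> hrefs)
{
    List<string> result = new List<string>();
    foreach (string href in hrefs)
    {
        Uri uri;
        if (Uri.TryCreate(href, UriKind.Absolute, out uri) &&
            (uri.Scheme == Uri.UriSchemeHttp || uri.Scheme == Uri.UriSchemeHttps))
        {
            result.Add(href);
        }
    }
    return result;
}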
I'm forming a newsletter with links to various HTML modules within my DNN website. I have access to each of their ModuleIDs and I want to use those to get the URLs. The current approach (made by a third-party developer) worked, but only to a degree: the URLs are incorrectly formed when the modules are located deeper in the website.
For example, a module located at www.website.com/website/articles.aspx works fine, but a module located at www.website.com/website/articles/subarticles.aspx won't. I know this is because the URL is incorrectly formed.
Here's the current code:
DotNetNuke.Entities.Modules.ModuleController objModCtrlg = new DotNetNuke.Entities.Modules.ModuleController();
DotNetNuke.Entities.Modules.ModuleInfo dgfdgdg = objModCtrlg.GetModule(ContentMID);
TabController objtabctrll = new TabController();
TabInfo objtabinfoo = objtabctrll.GetTab(tabidfrcontent);
string tabnamefremail = objtabinfoo.TabName;
moduletitlefrEmail = dgfdgdg.ModuleTitle;
string readmorelinkpath = basePath + "/" + tabnamefremail + ".aspx";
ContentMID is the current module ID I'm looking at. I've tried to use Globals.NavigateURL, but that always crashes with an "Object reference not set to an instance of an object." error. The same thing happens when I use objtabinfoo.FullUrl, so I'm currently at a loss as to how to get the specific module's URL.
EDIT: Here's some more code as to how the tabId is retrieved.
IDictionary<int, TabInfo> dicTabInfo12 = new Dictionary<int, TabInfo>();
ContentMID = Convert.ToInt32(dsNewsList.Tables[0].Rows[i]["ModuleID"]);
dicTabInfo12 = objTabctrl.GetTabsByModuleID(ContentMID);
if (dicTabInfo12.Count > 0)
{
    string tester = ""; //Debug
    foreach (KeyValuePair<int, TabInfo> item1 in dicTabInfo12)
    {
        tabidfrcontent = item1.Key;
    }
}
You really should be using NavigateUrl to build the links, and since you have the tabid, you are golden.
string readMoreLinkPath = NavigateUrl(tabidfrcontent);
Nice and simple
Okay, a colleague suggested this and it works great within a scheduler.
string linkPath = basePath + "/Default.aspx?TabID=" + tabID;
This will navigate you to the correct tab. So this would be the best solution if you're forced to work within a scheduler, where you can't use NavigateUrl without some major workarounds.
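Putting the two answers together, inside a scheduled task it might look something like this (a sketch; basePath and ContentMID are assumed to exist as in the question):
// Sketch for a scheduler context where NavigateUrl is unavailable.
// basePath and ContentMID are assumed to be set as in the question.
TabController tabController = new TabController();
int tabId = -1;
foreach (KeyValuePair<int, TabInfo> pair in tabController.GetTabsByModuleID(ContentMID))
{
    tabId = pair.Key; // take the tab hosting this module
}

string readMoreLinkPath = basePath + "/Default.aspx?TabID=" + tabId;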
I'm trying to allow users to post videos on my site by supplying only the URL. Right now I'm able to allow YouTube videos by just parsing the URL and obtaining the ID, and then inserting that ID into their given "embed" code and putting that on the page.
This limits me to only YouTube videos, however. What I'm looking to do is something similar to Facebook, where you can put in the YouTube "Share" URL or the URL of the page directly, or any other video URL, and it loads the video into their player.
Any idea how they do this? Or any other comparable way to show a video based just on a URL? Keep in mind that YouTube videos (which would probably be the most popular anyway) don't give the video URL, but the URL of the video's YouTube page (which is why their embed code is needed with just the ID).
Hopefully this made sense, and I hope somebody might be able to offer me some advice on where to look!
Thanks guys.
I would suggest adding support for OpenGraph attributes, which are common among content services that want other sites to be able to embed their content. The information is contained in the pages' <meta> tags, which means you would have to load the URL via something like HtmlAgilityPack:
var webClient = new WebClient();
var doc = new HtmlDocument();
doc.Load(webClient.OpenRead(url)); // not exactly production quality

var openGraph = new Dictionary<string, string>();
// Note: SelectNodes can return null if there are no <meta> tags; guard if needed
foreach (var meta in doc.DocumentNode.SelectNodes("//meta"))
{
    var property = meta.Attributes["property"];
    var content = meta.Attributes["content"];
    if (property != null && property.Value.StartsWith("og:"))
    {
        openGraph[property.Value]
            = content != null ? content.Value : String.Empty;
    }
}
// Supported by: YouTube, Vimeo, CollegeHumor, etc
if (openGraph.ContainsKey("og:video"))
{
    // 1. Get the MIME Type
    string mime;
    if (!openGraph.TryGetValue("og:video:type", out mime))
    {
        mime = "application/x-shockwave-flash"; // should error
    }

    // 2. Get width/height
    string _w, _h;
    if (!openGraph.TryGetValue("og:video:width", out _w)
        || !openGraph.TryGetValue("og:video:height", out _h))
    {
        _w = _h = "300"; // probably an error :)
    }
    int w = Int32.Parse(_w), h = Int32.Parse(_h);

    Console.WriteLine(
        "<embed src=\"{0}\" type=\"{1}\" width=\"{2}\" height=\"{3}\" />",
        openGraph["og:video"],
        mime,
        w,
        h);
}