Background
I'm trying to implement a simple embedded web server to act as a web interface for a desktop application.
I first tried HttpListener, which worked but required admin privileges (in one way or another), which I felt was unnecessary. I'm now trying a TcpListener-based approach.
Progress
Serving files works well, but I'm having a problem with file uploads. I basically tried to use the answer found in another question, but instead of plugging in HttpListenerContext.Request.InputStream I used TcpClient.GetStream().
Problem
The problem is that this seems to work very erratically. Sometimes it works fine, but most of the time it doesn't. When it doesn't work, the thread doesn't seem to do anything until I press abort in my browser, at which point it throws a "Start boundary not found" exception (see the code in the link).
Question
Now, my questions are:
Am I doing this the right way, or is there a simpler way to create an HTTP server (third-party libraries included)?
What could be the possible causes for my problem?
What parts of code would you need to see to help me further?
The Cassini project is what you need. You can also look at XSP in Mono.
You should be able to embed those projects into your code and host ASP.NET. If that is not possible, you can start looking at the XSP sources and implement the web server part yourself.
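If you do stay with the raw TcpListener approach, one likely cause of the hang you describe is that the upload parser reads the request stream until the connection closes, which never happens while the browser keeps the connection alive, so the parse only fails once you press abort. Below is a rough sketch of bounding the read with Content-Length; the helper name is illustrative rather than taken from your code, and chunked transfer encoding is ignored.

    using System;
    using System.IO;
    using System.Net.Sockets;
    using System.Text;

    static class RequestReader
    {
        // Read the headers, find Content-Length, then read exactly that many
        // body bytes instead of reading until the connection closes.
        public static byte[] ReadRequestBody(TcpClient client)
        {
            NetworkStream stream = client.GetStream();

            // Read byte by byte until the blank line (\r\n\r\n) that ends the headers.
            var headerBytes = new MemoryStream();
            int b, match = 0;
            while (match < 4 && (b = stream.ReadByte()) != -1)
            {
                headerBytes.WriteByte((byte)b);
                if ((match % 2 == 0 && b == '\r') || (match % 2 == 1 && b == '\n'))
                    match++;
                else
                    match = (b == '\r') ? 1 : 0;
            }

            // Pull Content-Length out of the header block.
            string headers = Encoding.ASCII.GetString(headerBytes.ToArray());
            int contentLength = 0;
            foreach (string line in headers.Split(new[] { "\r\n" }, StringSplitOptions.RemoveEmptyEntries))
            {
                if (line.StartsWith("Content-Length:", StringComparison.OrdinalIgnoreCase))
                    contentLength = int.Parse(line.Substring("Content-Length:".Length).Trim());
            }

            // Read exactly contentLength bytes; do not wait for the socket to close.
            byte[] body = new byte[contentLength];
            int read = 0;
            while (read < contentLength)
            {
                int n = stream.Read(body, read, contentLength - read);
                if (n == 0) break; // client disconnected early
                read += n;
            }
            return body;
        }
    }

Once the body is bounded like this, the multipart parser from the linked answer can be fed a MemoryStream over those bytes instead of the raw network stream.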
Related
I have been using ASP.NET Charts for some time. Even though it was hard to configure for ASP.NET 3.5 and get working, I have been able to move along with my project, but now I'm facing something I have no idea how to solve.
The project works perfectly in my development environment, but when I deployed it to the server, the only page with charts on it appeared like this:
I assume it's because of the charts, since they are the only thing that has really given me problems and the only thing that is exclusive to this page, but I have no idea how this happens. It only happens on the server with Plesk.
I tried deploying a debug build instead of a release build, but the result is the same. I have searched all over but never found anything like this.
My answer may not give you the exact solution, but it may give you some ideas.
Below are my ideas:
Make sure you have included all the files in your solution when you do the build.
Make sure all of the ASP.NET Charts dependencies are present on the server.
Check your browser console and network tabs (press F12 in your browser). You can pinpoint the exact problem if there is an issue loading the page or its dependency files.
Double-check your server configuration for the charts, and compare your local and server configs.
Do a deployment on your own machine first and check one round there.
I'm sure you have all seen links that can launch, for example, the Yahoo Messenger application on the client side if it is installed on the client machine. I want to know a bit more about how I should register my protocol (I don't even know what to call it) so that it opens my specified application.
Please excuse me: as I don't know how I should tag this question, I have tagged it with very general tags.
This might be of help to you? :)
As far as I know, it's called magnet links... :)
How do I make the website execute links?
Edit: Changed the link to a more appropriate one...
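To add a bit more detail: if the goal is a link like ymsgr: that launches a locally installed application, on Windows this is normally done by registering a custom URI scheme in the registry. Here is a minimal sketch in C#; the scheme name "myapp" and the executable path are placeholders, not values from the question.

    using Microsoft.Win32;

    static class ProtocolRegistration
    {
        // Registers a custom URI scheme for the current user so that clicking
        // a myapp:... link launches the desktop application.
        public static void Register()
        {
            using (RegistryKey key = Registry.CurrentUser.CreateSubKey(@"Software\Classes\myapp"))
            {
                key.SetValue("", "URL:MyApp Protocol");
                key.SetValue("URL Protocol", "");

                using (RegistryKey command = key.CreateSubKey(@"shell\open\command"))
                {
                    // "%1" is replaced with the full link that was clicked.
                    command.SetValue("", "\"C:\\Path\\To\\MyApp.exe\" \"%1\"");
                }
            }
        }
    }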
As a company we have worked with classic ASP (VBScript) for years, and we have just started updating to C# .NET. Our first MVC3 project is ready to be uploaded to the web server for testing and to iron out any bugs.
After reading about it, I have made myself fairly familiar with the theory.
System.Web.Mvc
System.Web.Routing
System.Web.Abstractions
have all been set to Copy Local = 'true'.
Right-clicked the solution and selected 'Publish'
Created a new profile
Filled in the connection details, although I am unsure exactly what is meant by the 'Site Path' and 'Destination URL'
As it stands, the site path is the scripting path and the destination URL is the URL as it would be typed into a browser's address bar.
The connection does validate.
In the settings I have selected Release.
Then there is a little tick box which seems scary to me; it says "Delete all existing files prior to publish". The server I am uploading to contains all our live and test websites. Although I have created a new folder for the project, under no circumstances do I want it to touch, edit, modify or delete anything else on the server, so this box is unchecked. Can anyone verify that leaving it unchecked will ensure it does nothing to anything else on the server?
Then in the preview it simply says "Your application will be published to: (IP address of server)".
Can anyone who has done this before give me some guidance on whether this is the correct way to go about it? I could do it with fewer worries through normal FTP, but I would like to be able to utilise Visual Studio's tools. It's Visual Studio 2012.
Sorry if this isn't the exact correct place for this question.
After trying to do this for a while, I discovered that publishing to FTP was a waste of time and the hard way to go about things, although probably a lot of you know this.
Instead I just published to the file system and then uploaded it to the web server with CuteFTP. This may not be the most professional way to go about things, but as someone who comes from a primarily web-scripting background, I found it a lot less confusing and a lot easier to manage.
I just thought I would answer my own question to resolve this thread.
I am looking to create a desktop application in C# which:
Allows the user to select a file / multiple files / a folder containing files from their computer.
Uploads the selected files to a PHP script (which is already equipped to handle file uploads using the $_FILES array).
I'm a PHP developer and have never coded a single line of .NET before. So, you can assume I have no experience with .NET whatsoever.
I have looked this up online, and all I seem to come up with are ASP.NET server-side upload controls, which I do not want. I'm looking for a client-side solution. Also, will I have to make any changes to my PHP script? The script already handles uploads from an HTML multipart form.
If anyone can point me in the right direction of where to look and what C# controls are available to help me create the application I need, I would really appreciate it.
The first, and simplest, way to go about this is to use any of WebClient's UploadFile methods.
Here's some info and an example:
http://msdn.microsoft.com/en-us/library/36s52zhs.aspx
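For a single file, a minimal sketch might look like the following; the URL and file path are placeholders, and WebClient posts the file as a multipart/form-data field named "file", so the PHP side would typically read it from $_FILES["file"].

    using System;
    using System.Net;
    using System.Text;

    class UploadExample
    {
        static void Main()
        {
            // Placeholder URL and path: point these at your own script and file.
            const string uploadUrl = "http://example.com/upload.php";
            const string filePath = @"C:\temp\photo.jpg";

            using (var client = new WebClient())
            {
                // Sends a multipart/form-data POST; the PHP script sees it in $_FILES.
                byte[] response = client.UploadFile(uploadUrl, "POST", filePath);
                Console.WriteLine(Encoding.UTF8.GetString(response));
            }
        }
    }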
I have a feeling that this will not be enough for you, since you want to upload multiple files in a single request. The WebClient class can be used to manually build an HTTP multipart request, which is probably your best bet.
It's a bit much to explain how to achieve this here on SO, but there are good guides out there.
Here are a couple of very to-the-point articles
http://www.codeproject.com/KB/cs/uploadfileex.aspx
http://www.codeproject.com/KB/IP/multipart_request_C_.aspx
And if you're interested in the details, or better OO design, here's an alternative (a bit harder to follow if you're not experienced with C#)
http://ferozedaud.blogspot.com/2010/03/multipart-form-upload-helper.html
I think both articles should give you enough info to get started.
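For completeness, if you can target .NET 4.5 or later, HttpClient with MultipartFormDataContent builds the multipart body for you instead of hand-rolling it with WebClient. A rough sketch follows; the URL, file paths and the "files[]" field name are placeholders that must match whatever your PHP script expects.

    using System;
    using System.IO;
    using System.Net.Http;

    class MultiUploadExample
    {
        static void Main()
        {
            // Placeholder URL and file paths.
            const string uploadUrl = "http://example.com/upload.php";
            string[] files = { @"C:\temp\a.jpg", @"C:\temp\b.jpg" };

            using (var client = new HttpClient())
            using (var content = new MultipartFormDataContent())
            {
                foreach (string path in files)
                {
                    // "files[]" lets PHP collect the uploads as an array in $_FILES.
                    content.Add(new StreamContent(File.OpenRead(path)),
                                "files[]", Path.GetFileName(path));
                }

                HttpResponseMessage response =
                    client.PostAsync(uploadUrl, content).GetAwaiter().GetResult();
                Console.WriteLine(response.Content.ReadAsStringAsync().GetAwaiter().GetResult());
            }
        }
    }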
I want to find a decent solution to track the URLs and HTML content that users are visiting and provide more information to the user. The solution should have minimal impact on end users.
I don't want to write plugins for different browsers; they're hard to maintain.
A proxy-based approach isn't acceptable, since I don't want to change any of the user's proxy settings.
My application is written in C# and targets Windows. It would be best if the solution could support other operating systems as well.
Based on my research, I found the following methods that look workable, but all of them have drawbacks and I can't determine which one is best.
Use WinPcap
WinPcap sniffs all TCP packets without changing any user settings and only requires installing the WinPcap setup, which is acceptable to me. But I have two questions:
a. How do I convert TCP packets into URLs and HTML?
b. Does it really impact performance? I don't know whether sniffing all TCP traffic is too much overhead for this requirement.
Find history files for different browsers
This looks like the easiest way, but I wonder whether it is reliable. I am not sure whether the browser writes history consistently, or when it writes it. My application needs to pop up information before the user leaves the current page, so this solution won't work for me if the browser only writes the history file when the user closes it.
Use FindWindow, accessibility objects or a COM interface to find the UI element which contains the URL
I find this approach incomplete; for example, Chrome will only expose the active tab's URL, not all of them.
Another drawback is that I would have to request the URL a second time to get its HTML content.
Any comment or suggestion is welcome.
BTW, I am not writing spyware. The application tries to find all the RSS feeds on a web page and show them to end users. I could easily do that in a browser plugin, but I really want to support multiple browsers with a single UI. Thanks.
Though this is a very old post, I thought I'd give some input.
Approach 1, using WinPcap, is the best one. It will work with any browser, and even with the built-in browser of any other installed application. The approach is less resource-consuming too.
There is a library, Pcap.Net, that has an HTTP parser. You can reconstruct the HTTP stream and use its HttpResponseDatagram to parse the body, which your application can then consume.
This link helped give me more insight:
Tcp Session Reconstruction with Winpcap
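To make the idea a bit more concrete, here is a rough sketch of capturing HTTP requests with Pcap.Net. The member names are based on the library's documented API as I recall it and should be double-checked against the version you use, and this only sees plain HTTP on port 80 (HTTPS is not visible this way).

    using System;
    using PcapDotNet.Core;
    using PcapDotNet.Packets.Http;

    class HttpSniffSketch
    {
        static void Main()
        {
            // Take the first capture device; a real application would let the user pick one.
            LivePacketDevice device = LivePacketDevice.AllLocalMachine[0];

            using (PacketCommunicator communicator =
                device.Open(65536, PacketDeviceOpenAttributes.Promiscuous, 1000))
            {
                // Only look at plain HTTP traffic.
                communicator.SetFilter("tcp port 80");

                communicator.ReceivePackets(0, packet =>
                {
                    HttpDatagram http = packet.Ethernet.IpV4.Tcp.Http;
                    if (http == null || !http.IsRequest)
                        return;

                    var request = (HttpRequestDatagram)http;
                    // The request line gives the method and URI; the Host header
                    // would be needed to build the full URL.
                    Console.WriteLine("{0} {1}", request.Method, request.Uri);
                });
            }
        }
    }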