I am using Oracle RightNow CRM. When opening the cloud application, users get error messages quite often, though not all the time. At first we thought it was a bandwidth issue, so we tried allocating a dedicated 1:1, 10 Mbps leased line for it, but the same error is still being thrown.
http://textuploader.com/kyt2
Are you behind a firewall? The RightNow app requires that quite a few hosts be opened up to outgoing traffic. The list of hosts can be found in the RightNow Environmental Configuration Guides (https://cx.rightnow.com/app/answers/detail/a_id/2364/kw/Environmental%20configuration%20guide).
Find your CX version and check to make sure that outgoing traffic is open to all required hosts including:
*.custhelp.com
*.rightnowtech.com
*.rightnow.com
*.rnttraining.com
*.livelook.com and *.livelook.net (for co-browse functionality)
*.birst.com (for Enterprise Analytics functionality)
*.hivelive.com (for Oracle RightNow Social functionality)
*.rnengage.com
The guides contain other notes around network requirements that may be helpful.
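If you want to rule the firewall in or out quickly, a small outbound-connectivity check against the hosts you actually use can help. This is only a rough sketch: the hostnames below are placeholders for your own instance names (the wildcard entries can't be tested directly), and port 443 is assumed.

using System;
using System.Net.Sockets;

class ConnectivityCheck
{
    static void Main()
    {
        // Placeholder hostnames - substitute the instances your site uses.
        string[] hosts =
        {
            "yoursite.custhelp.com",
            "yoursite.rightnowtech.com",
            "yoursite.rightnow.com"
        };

        foreach (string host in hosts)
        {
            try
            {
                using (var client = new TcpClient())
                {
                    // Fails fast if outgoing traffic to the host is blocked.
                    if (!client.ConnectAsync(host, 443).Wait(5000))
                        throw new TimeoutException("connect timed out");
                }
                Console.WriteLine("{0}: reachable", host);
            }
            catch (Exception ex)
            {
                Console.WriteLine("{0}: blocked ({1})", host, ex.Message);
            }
        }
    }
}

Running this from an affected workstation during one of the error windows should show quickly whether the intermittent failures line up with blocked or flaky connectivity.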
This is my first Stack Overflow question so apologies if this isn't great...
I'm sure this is either something super simple that I'm missing or something very complex that I've gotten myself into, but I am using ClickOnce for the first time to create an automated updater for a company application I developed.
The application itself was originally written in VB, but I have translated it into C#. We use it to manage a database of assets, which changes very frequently. I have been tasked with adding automated updates, to keep from confusing some of the techs with uninstalling/reinstalling the application weekly.
I volunteered to make an FTP server using a personal server machine I use at home. Normally this machine would be used for local networking but I've wanted to create an FTP server for some time (this is my first FTP server too).
So I went on my way, set the publish location for the build to ftp://[IP.ADDRESS]:21/Folder/Subfolder and the Installation folder URL to http://[IP.ADDRESS]:21/Folder/Subfolder
Long story short, when I try to test an update (changing only the assembly version), I get an error:
System.Deployment.Application.DeploymentDownloadException: Downloading http://[IP.ADDRESS]:21/Folder/Subfolder/applciation.application did not succeed ---> System.Net.WebException: The server committed a protocol violation.
I did some research and tried adding an SSL certificate and changed the update path to https://[IP.ADDRESS]:21/Folder/Subfolder/ then tested that. This time around, I get this error:
System.Deployment.Application.DeploymentDownloadException: Downloading http://[IP.ADDRESS]:21/Folder/Subfolder/applciation.application did not succeed ---> System.Net.WebException: The underlying connection was closed: An unexpected error occurred on a send. --> System.IO.IOException: The handshake failed due to an unexpected format.
I cannot tell if this is progress or if I moved backwards here, LOL. I've been jumping back and forth across many threads trying to figure out where this is going wrong. I'm also having a pretty tricky time working out whether the error is in how I've set up ClickOnce or in how I've set up FTP with IIS.
Apologies if this is not enough information, I can provide more if necessary. Also apologies if this is too much information! Any help or guidance is appreciated!
I'm guessing you're working for a small company and infrastructure/resources are at a premium. With that in mind I'll offer some suggestions:
Does your company have a network shared drive? I don't like ClickOnce, but I have deployed it to network shares in the past with success (a sketch follows at the end of this answer). This has the benefit that you don't need to deal with security.
Have you considered migrating this to a web application? Web development seemed really daunting when I was a native app developer, but with Blazor and ASP.NET Core it's become a lot more accessible. This would completely get rid of the need for updating the application.
Consider an alternative deployment route. ClickOnce is not incredibly well supported.
I'd be remiss if I didn't throw a red flag on security. FTP is a very old protocol and is basically insecure by design. Hosting it on your home server means that you're transmitting the app over the public internet... What would happen if someone outside your company installed the application?
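If you do go the network-share route from the first suggestion, the ClickOnce settings are much simpler than the FTP/HTTP pairing; a sketch with a made-up share name:

Publish folder location:   \\fileserver\deploy\AssetApp\
Installation folder URL:   \\fileserver\deploy\AssetApp\

With both pointing at the same UNC path, clients install from setup.exe on the share and check that same location for updates. It also sidesteps what the first error above is probably complaining about: http:// on port 21 reaches the FTP service rather than a web server, which is exactly the kind of mismatch that yields "The server committed a protocol violation."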
I have a C# app that exports data from the database to the user's Exchange Calendar, using Microsoft.Exchange.WebServices (15.0.0.0). My app targets .NET 4.5.2 and I use Visual Studio 2013.
The app has been working fine for a year, and even now works 95% of the time. The app runs constantly on a machine that I am logged into and monitor. This is a low-volume process; perhaps 50 items get exported per day at most.
Every couple of days, the program will give this error:
"The request failed. Unable to connect to the remote server"
when the app is attempting to create an EWS connection to the server.
It will do this for maybe 10-20 items. So I shut it down, run it again, and it works perfectly fine on all the records that failed before.
This is my first EWS app, but I've been programming for 30+ years, though I have relatively little knowledge of internet-based apps.
Any helpful information or suggestions would be greatly appreciated!
Thank you kind people!
The internet often works, but it should never be considered reliable. Anything using it should handle errors and retry if desired. The classic approach is exponential backoff. It's possible something nasty is going on, like your ISP swapping your IP address, or it may just be intermittent failure. I don't know anything EWS-specific, but there may be sources of flakiness there as well.
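A minimal sketch of that retry pattern, written to compile on C# 5 / VS 2013; ExportItem is a stand-in name for whatever method wraps your EWS export call:

using System;
using System.Threading;

static class RetryHelper
{
    // Retries an action with exponential backoff: 1s, 2s, 4s, 8s, ...
    public static void WithBackoff(Action action, int maxAttempts = 5)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                action();
                return; // succeeded
            }
            catch (Exception)
            {
                if (attempt >= maxAttempts)
                    throw; // give up and surface the failure to the caller
                // Wait 2^(attempt-1) seconds before the next try.
                Thread.Sleep(TimeSpan.FromSeconds(Math.Pow(2, attempt - 1)));
            }
        }
    }
}

Usage would look like RetryHelper.WithBackoff(() => ExportItem(record)). Adding random jitter to the sleep is a common refinement, so that many clients don't all retry in lockstep.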
I have a bunch of small desktop applications for which I have a simple database for keeping user data (who uses which app and in which version) etc.
I want the apps to connect to Azure SQL Server and update a database record when they're started. My apps have the ADO.NET connection string hardcoded in them.
It works fine from my home network and my company's guest network - however, the corporate network has some ports disabled, and that apparently includes port 1433. As per the Microsoft troubleshooting guide, I tried telnet and failed:
C:\Users\xxx>telnet 65.55.74.144 1433
Connecting To 65.55.74.144...Could not open connection to the host, on port 1433: Connect failed
I cannot connect either via my applications or via the SQL Server explorer in Visual Studio.
So, the question is - how can I get around this problem? It is highly doubtful that corporate IT will unlock a port just because I ask; besides, I want to keep this as simple, low-profile, and independent as possible. Or maybe my approach is wrong from the very beginning and I should be doing this differently?
Cheers
Bartek
You can't.
Make your desktop applications talk to web services instead, over HTTP/HTTPS. Among other things, this allows more controlled access (right now anyone can connect to your database and modify the data, since your access credentials are distributed with your app).
A side effect of using web services is that ports 80/443 are almost always open in corporate firewalls.
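As a sketch of what the client side could look like, with a hypothetical endpoint URL, the app would POST its usage record over HTTPS instead of opening a SQL connection:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class UsageReporter
{
    // Hypothetical endpoint - you would host this service yourself,
    // e.g. as a small web app sitting in front of the Azure SQL database.
    const string Endpoint = "https://yourservice.azurewebsites.net/api/usage";

    public static async Task ReportAsync(string app, string version, string user)
    {
        using (var http = new HttpClient())
        {
            // Minimal hand-built JSON payload to keep the sketch dependency-free.
            string json = string.Format(
                "{{\"app\":\"{0}\",\"version\":\"{1}\",\"user\":\"{2}\"}}",
                app, version, user);

            var content = new StringContent(json, Encoding.UTF8, "application/json");
            HttpResponseMessage response = await http.PostAsync(Endpoint, content);
            response.EnsureSuccessStatusCode();
        }
    }
}

The service behind that URL keeps the SQL credentials server-side, so nothing sensitive ships inside the desktop app, and the traffic goes out over port 443, which corporate firewalls almost always allow.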
I am having an issue with Team Foundation Server where I get the error 'Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.' whenever I try to check in a project. I also get this error from time to time when I try to 'Get Latest Version'. I have tried TFS in both Visual Studio 2010 and Visual Studio 2013, but I get the same issue.
I have also tried the following:
Remapping my TFS Source Control
Deleting all files from the local path of my source control and redownloading
Turning my firewalls off
Switching the port my Ethernet cable is connected to
Does anyone have any idea how to fix this? I would be massively grateful!
I came across this article, which talks about this exact problem. The author describes the error as being related to an http.sys bug.
Below is an excerpt from that article:
Http.sys is the http protocol stack that IIS uses to perform http communication with clients. It has a timer called MinBytesPerSecond that is responsible for killing a connection if its transfer rate drops below some kb/sec threshold. By default, that threshold is set to 240 kb/sec. It turns out that there is a bug with this timer and it is causing connections to be prematurely killed. We have found that lowering this threshold reduces the number of connections that are killed by the server.
See if that helps?
Note: As mentioned in the article, the hotfix and settings have to be applied on the Application Tier (AT). TFS consists of an Application Tier and a Database Tier (DT). If you're unfamiliar with the terms, you probably have a single-server installation, which means both the AT and DT are on the same server.
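If you want to experiment with that threshold, on IIS 7 and later it is exposed as the minBytesPerSecond attribute in the system.applicationHost/webLimits section. A sketch of the change, to be run on the AT and verified against your own environment (setting it to 0 disables the minimum-rate check entirely):

%windir%\system32\inetsrv\appcmd.exe set config -section:system.applicationHost/webLimits /minBytesPerSecond:0 /commit:apphost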
I'm working on a graduation project for one of my university courses, and I need to find some place to run several crawlers I wrote in C#. With no web hosting experience, I'm a bit lost. Is this something that any site allows? Do I need a special host that gives more access to the server? The crawler is a simple app that does its work, then periodically writes information to a remote database.
A web crawler is a simulation of a normal user. It accesses sites the way browsers do, getting the HTML (JavaScript, etc.) returned by the server, so it has no internal access to server code. That being so, any site can be crawled.
Be aware of web crawler ethics guidelines: there are pages you shouldn't index or whose links you shouldn't follow. Web developers also publish files and instructions for crawlers (robots.txt being the usual mechanism), saying what you may index or follow.
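For a feel of what that fetch-and-extract loop actually does, here is a minimal C# sketch; example.com stands in for a real site, and a production crawler would properly parse the robots.txt rules rather than just print them:

using System;
using System.Net.Http;
using System.Text.RegularExpressions;
using System.Threading.Tasks;

class MiniCrawler
{
    static async Task Main()
    {
        using (var http = new HttpClient())
        {
            // A polite crawler identifies itself in the User-Agent header.
            http.DefaultRequestHeaders.UserAgent.ParseAdd("MiniCrawler/1.0");

            try
            {
                // Check the site's crawler instructions first.
                string robots = await http.GetStringAsync("https://example.com/robots.txt");
                Console.WriteLine(robots); // a real crawler honors the Disallow rules
            }
            catch (HttpRequestException)
            {
                // No robots.txt published; by convention the site may be crawled.
            }

            // Fetch a page the way a browser would and pull out its links.
            string html = await http.GetStringAsync("https://example.com/");
            foreach (Match m in Regex.Matches(html, "href=\"(https?://[^\"]+)\""))
                Console.WriteLine(m.Groups[1].Value);
        }
    }
}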
If you can't run it off your desktop for some reason, you'll need a host that lets you execute arbitrary C# code. Most cheap web hosts don't allow this, due to the potential security implications of several other customers running on the same server.
This means you'll need to be on a server where you have your own OS. Either a VPS - Virtual Private Server, where virtualization is used to give you your own OS but share the hardware - or your own dedicated server, where you have both the hardware and software to yourself.
Note that if you're running on a server that's shared in any way, you'll need to throttle your crawler so as to not cause problems for your neighbors; the primary concerns are CPU and bandwidth. This isn't just politeness: most web hosts will suspend your hosting if you're causing problems on their network, such as starving the other users on your hardware by consuming all of its resources yourself. You can usually burst to higher usage levels, but they'll cut you off if you sustain them for a significant period of time. A minimal throttling sketch follows.
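This sketch fetches sequentially with a fixed pause between requests; the delay value is something you would tune to your host's limits:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class ThrottledFetcher
{
    static readonly HttpClient Http = new HttpClient();

    // Fetches URLs one at a time, pausing between requests so the crawler
    // never saturates the shared host's CPU or bandwidth allowance.
    public static async Task FetchAllAsync(string[] urls, TimeSpan delay)
    {
        foreach (string url in urls)
        {
            string body = await Http.GetStringAsync(url);
            Console.WriteLine("{0}: {1} bytes", url, body.Length);
            await Task.Delay(delay); // e.g. TimeSpan.FromSeconds(2)
        }
    }
}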
This doesn't seem to have anything to do with web hosting. You just need a machine with an internet connection and a database server.
I'd check with your university if I were you. At least in my time, a lot was possible to arrange in-house when it came to graduation projects.
Failing that, you could look into a simple VPS (Virtual Private Server) account. Unless you are sure your app runs under Mono, you will need a Windows one. The resource limits are usually a lot lower than you'd get from a dedicated server, but they're relatively affordable. Some providers offer an MS SQL Server database you can use alongside the VPS account (on another machine); installing SQL Server on the VPS itself can be a problem license-wise.
Make sure you check the terms of use and the (virtual) system specs before you open an account. Also check whether there is some kind of minimum contract period; sometimes this can be longer than a single month, especially if there is no setup fee.
If at all possible, find a host that's geographically close to you. A server on the other side of the world can get a little annoying to access remotely using Remote Desktop.
80legs lets you use their crawlers to process millions of web pages with your own program.
The rates are:
$2.00 per million pages
$0.03 per CPU-hour
They claim to crawl 2 billion web pages a day.
You will need a VPS (virtual private server) or a full-on dedicated server. Crawlers are nothing more than applications that "crawl" the internet. While you could set up a web site to be a crawler, it is not practical, because the web page would have to be accessed for your crawler to work. You will have to read the ToS (terms of service) for the host to see what the usage terms are. Some of the lower-priced hosts will cut your connection with a reason of "negatively impacting the network" if you use too much bandwidth, even though they have given you plenty to use.
VPSes run around $30-80 for a Linux server and $60+ for a Windows server.
Dedicated servers run $100+ for both Linux and Windows.
You don't need any web hosting to run your spider. Just ask for a PC with an internet connection that can act as a dedicated server, configure the database, and run the crawler from there.