This is a general question really. I read somewhere, it may have been a Microsoft whitepaper or blog, I'm really not sure as I don't have the link right now.
Basically, the author was describing how, in ASP.NET AJAX 1.0, when you use an UpdatePanel you would expect only the associated controls and components to be updated/refreshed, but in fact the entire page is submitted with the request.
Obviously this defeats the purpose of AJAX, whose whole design is to minimize traffic sent to and from the server; in this case you might as well do a full postback for the page. I guess from a cosmetic point of view ASP.NET AJAX does the trick as intended, but behind the scenes it doesn't do what you would expect.
Now this could well have been resolved in later versions, I just can't confirm it. I have searched Google high and low for an answer.
What the person said was to use jQuery instead, as it offers truly optimized traffic when updating, which of course it does. So this is why I ask: have Microsoft done the same in their later versions?
Just thought I'd ask you lot before attempting the impossible task of asking someone at Microsoft.
I'll have a look for the link when I get home and if I find it I'll add it here just so you don't think I'm off my rocker. :)
This post, ASP.Net AJAX Not Working (Full page postback), explains the problem and the solution pretty well.
I am a real ASP.NET beginner and I could use advice from someone wiser than me on where to look and what to learn.
I am working on an ASP.NET/C# web app for our department (targeting .NET 4.8).
In short: I need to update the page for multiple users every time a certain value changes in the code-behind, and do it as fast as possible.
Long: There is a mandatory feature I can't bypass. Situation: 50 users will be on the same .aspx page, watching it. There will be a field with a certain value (it can be a text field or whatever element helps me achieve this). Admins can change the value of this element by clicking one of the related buttons. If such an event happens, it is absolutely crucial to reflect this change in all of the browsers simultaneously and as fast as possible. Milliseconds matter. (Not my idea.)
What would be the best way to achieve this?
Please excuse me if my question is too silly. I am not expecting sample/solution code for such a badly formulated question, just a general direction where to look (what to learn), as I could not find any solution so far. There should be something, since this kind of resembles an auction system: you bid, others see it immediately and can take action.
Thank you in advance.
Your use case seems to fit well with SignalR.
Basically you need something that allows you to set up bidirectional communication between the client and the server, so that you are able to broadcast a message from the server to all the connected clients whenever something important happens on the server.
You can start from the official documentation. This topic is broad, so first of all read the docs and play with some of the provided sample applications. The advantage of this framework for real-time web applications is that it fits really well into the ASP.NET and ASP.NET Core ecosystem.
Avoid any solution based on clients polling the server state; by doing so you will basically DDoS yourself. A push-based solution (like the one offered by SignalR) is far more efficient.
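To give a rough idea, here is a minimal sketch using ASP.NET SignalR (the Microsoft.AspNet.SignalR NuGet package, which runs on .NET Framework 4.8). The hub name, the client method name, and the ValueBroadcaster helper are all made up for illustration:

    using Microsoft.AspNet.SignalR;

    // Empty hub: clients connect to it, the server broadcasts through it.
    public class ValueHub : Hub
    {
    }

    public static class ValueBroadcaster
    {
        // Call this from the admin button's click handler after saving the new value.
        public static void Push(string newValue)
        {
            var context = GlobalHost.ConnectionManager.GetHubContext<ValueHub>();

            // Invokes the client-side "valueChanged" handler in every connected
            // browser at (nearly) the same time.
            context.Clients.All.valueChanged(newValue);
        }
    }

On the client, each page subscribes to valueChanged and updates the field when the message arrives; you also need app.MapSignalR() in an OWIN Startup class to wire the hub up.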
I am working on redesigning and streamlining the checkout process for an e-commerce website using ASP.NET/C# into one page. (Currently, the website doesn't use any ASP controls.)
Just for a little background info: the customer will be able to enter a shipping address, add a gift message, and select a shipping date per recipient, while updating the backend with the correct information. I want this process to be elegant and pain-free for customers to use.
Now, I have created WebMethods that get called through AJAX requests and that have worked correctly for previous features.
My question is:
Would it be an ok idea to submit ALL these requests through WebMethod AJAX calls, have it update the backend, and render the correct information to the user?
I know it is doable, but I just want to be sure that this is an OK approach to take. I have been looking at other JavaScript frameworks that might help, but I am too unfamiliar with them to know whether they would work or not.
Please let me know what you think and if you have other suggestions.
Thank you!
Short answer: Yes.
You mentioned:
an Ecommerce website using ASP.NET/C# into 1 page
If that is true, then you are talking about a single page web app. There are a couple of references to single page web apps here:
https://msdn.microsoft.com/en-us/magazine/dn463786.aspx
https://en.wikipedia.org/wiki/Single-page_application
It's very common to do what you are talking about. Using AJAX to perform back-end processes and then update small pieces of the front end is key to making fluid, fast, and responsive web applications. That's mainly because smaller pieces of data are being sent through the network, and the UI is being updated in smaller segments. If architected correctly, single page web apps can provide a slick, intuitive web experience for clients and customers. Emphasis on "If architected correctly". So make sure to do your research.
There are several libraries out there that can help you, such as:
jQuery
jQueryUI
Bootstrap
AngularJS
Sencha Ext JS
Backbone
Knockout
You don't have to use them, but they can save you a lot of time. At the very least, I would recommend using jQuery. Do some research and find out which one(s) will work best to fit your specific application.
I don't see any problem with this approach. The good part about AJAX WebMethods is that you transfer only the data needed to do the operation; it's better and faster than using an UpdatePanel.
We use a lot of AJAX WebMethods in our ASP.NET project and we have never had problems with them :)
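For anyone curious what that looks like on the server, here is a minimal sketch of a page-level WebMethod (the class name, method name, and parameters are invented for illustration):

    using System.Web.Services;

    public partial class Checkout : System.Web.UI.Page
    {
        // Static page method, callable via a POST to Checkout.aspx/SaveShippingAddress
        // with Content-Type: application/json.
        [WebMethod]
        public static string SaveShippingAddress(string recipientId, string address)
        {
            // Update the backend here (validation, database call, etc.).
            // Only the JSON arguments and this return value cross the wire --
            // no ViewState, no full-page markup.
            return "saved";
        }
    }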
One piece of advice: look into ASP.NET MVC, it's better than ASP.NET Web Forms.
I would like to integrate Google Analytics Experiments into our website.
My situation: we have a solution for feature toggles that also allows A/B testing. The features are stored in the database and have a percentage that defines how many users will see the feature. We also store the features in a cookie, so the user sees the same view when they refresh the page.
Now I want to use the JavaScript API to track the state of the feature (Google Experiments documentation). In my understanding, I have to send a request to Google whenever an experiment is used, and I also have to tell Google the ID of the alternative. It must happen on the right page to correlate the experiment with the right page view. The problem is AJAX requests, which might check the split-tested features; in that case it is hard to say which features are used for the current page.
I see 3 options:
Track all experiments, even if they are not used on the current page. (I think this does not make a lot of sense.)
Make a tool/configuration to define the experiments and feature toggles that are used on each page (very easy to make mistakes here).
Track the experiments from server-side code, roughly as sketched below (but I don't know how GA will connect these calls to a page view).
What is the official guideline for ajax requests and google analytics experiments?
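To make option 3 concrete, here is a hedged sketch of what I imagine, based on the Universal Analytics Measurement Protocol, which documents xid (experiment ID) and xvar (chosen variation) parameters. The tracking ID, the event category/action, and the helper name are placeholders, and the client ID would have to be read from the visitor's _ga cookie so the hit can be tied to their page views:

    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;

    public static class ExperimentTracker
    {
        private static readonly HttpClient Http = new HttpClient();

        public static async Task TrackAsync(string clientId, string experimentId, string variationId)
        {
            var payload = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["v"]    = "1",           // protocol version
                ["tid"]  = "UA-XXXXX-Y",  // property ID (placeholder)
                ["cid"]  = clientId,      // client ID taken from the _ga cookie
                ["t"]    = "event",       // hit type
                ["ec"]   = "experiments", // event category (made up)
                ["ea"]   = "impression",  // event action (made up)
                ["xid"]  = experimentId,  // experiment ID
                ["xvar"] = variationId    // chosen variation
            });

            await Http.PostAsync("https://www.google-analytics.com/collect", payload);
        }
    }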
I keep hearing that the server-side ASP.NET AJAX controls (like UpdatePanels) are not truly AJAX, even though they seem like it, because rendering is not completely done on the client side. I'm trying to understand this with more clarity. Could someone elaborate?
Thanks...
UpdatePanels came out fairly early in the AJAX cycle, and they're heavy (they can emit around 100-300k of JavaScript). Behind the scenes, UpdatePanels post the entire page back to the server via a JavaScript XMLHttpRequest. The new page is generated with the normal page lifecycle just like a postback, but only the parts that live inside the UpdatePanel (plus the parts necessary for updating ViewState and so on) are sent back to the client. From there, the markup is inserted without a visible flash or interruption of page state.
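One way to see the "full page lifecycle" part for yourself: drop something like this into the code-behind of a page that has a ScriptManager and an UpdatePanel (the page class here is hypothetical), and watch it fire on every partial update:

    using System;
    using System.Web.UI;

    public partial class Demo : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            var sm = ScriptManager.GetCurrent(this);

            // True during an UpdatePanel refresh: ViewState is processed and
            // control events fire -- the whole pipeline runs, and only the
            // rendered output is trimmed down to the panel's markup.
            if (sm != null && sm.IsInAsyncPostBack)
            {
                System.Diagnostics.Debug.WriteLine("Async postback: full lifecycle ran");
            }
        }
    }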
Most competing AJAX tools lean towards super lightweight implementations that let you ship or generate a small chunk of HTML via Javascript, and I would say that's the dominant direction today, especially outside the ASP.NET world.
The difference in total amount of data sent across the wire is huge -- see the link below. In low-traffic situations it might not make a bit of difference, but in the case of a site like StackOverflow, it would show up on the bandwidth bill for sure.
All that said, I don't think it's fair to say that UpdatePanels are not actually AJAX, since they do ship HTML around via async JavaScript -- it's just that there's a gigantic, often cumbersome framework on top. UpdatePanels get a bad rap sometimes, but they provide a brilliantly simple developer experience. I've often found them useful in low-traffic situations.
Update: Here is an article (old but still valid) that examines the payload UpdatePanels ship to and from the server. It also goes into Page Methods, which is a lightweight, Web Service-based alternative to UpdatePanels. This is an oft-overlooked part of Microsoft AJAX.
http://msdn.microsoft.com/en-us/magazine/cc163480.aspx
Maybe this will answer it? I hope so. (Reading it as well.)
Seems to me that the control is a server-side object that uses AJAX as a mechanism, and that the AJAX it renders does the client-side work. In this sense it isn't pure AJAX but rather a blend of multiple solutions.
:)
I am trying to detect whether a visitor is human or not. I just got an idea, but I'm not sure if it will work: I could store a cookie in the person's browser and retrieve it while they are browsing my site. If I successfully retrieve the cookie, can this be a good technique to detect bots and spiders?
A well-designed bot or spider can certainly store -- and send you back -- whatever cookies you're sending. So, no, this technique won't help one bit.
Browsers are just code. Bots are just code. Code can do anything you program it to. That includes cookies.
Bots, spammers and the like work on the principle of low-hanging fruit. They're after as many sites or users as they can get with as little effort as possible. Thus they go after popular packages like phpBB and vBulletin because getting into those will get them into a lot of sites.
By the same token, they won't spend a lot of effort to get into your site if that effort only pays off for your site (unless your site happens to be Facebook or the like). So the best defense against malicious activity of this kind is simply to be different, in such a way that an automatic script already written won't work on your site.
But an "I am human" cookie isn't the answer.
No, as Alex says this won't work; the typical process is to use a robots.txt to get them to behave. Further to that, you start to investigate the user-agent string (but this can be spoofed). Any more work than this and you're into CAPTCHA territory.
What are you actually trying to avoid?
You should take a look at the information in the actual HTTP headers and at how .NET exposes these things to you. The extra information you have about the person hitting your website is there. Take a look at what Firefox sends by installing the Live HTTP Headers plugin and visiting your own site. Basically, at page level, the Request.Headers property exposes this information. I don't know if it's the same in ASP.NET MVC, though. The important header for what you want is User-Agent. It can be altered, obviously, but the major crawlers will let you know who they are by sending a unique User-Agent that identifies them. Same goes for the major browsers.
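As a rough illustration (the helper and the token list are made up; real crawler User-Agent strings should be checked against each vendor's documentation):

    using System;
    using System.Web;

    public static class CrawlerDetector
    {
        // Illustrative, not exhaustive -- and remember the header can be spoofed.
        private static readonly string[] KnownBotTokens =
            { "Googlebot", "bingbot", "Slurp", "DuckDuckBot" };

        public static bool LooksLikeKnownBot(HttpRequest request)
        {
            string userAgent = request.UserAgent ?? string.Empty;

            foreach (var token in KnownBotTokens)
            {
                if (userAgent.IndexOf(token, StringComparison.OrdinalIgnoreCase) >= 0)
                    return true;
            }

            return false;
        }
    }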
I wrote a bot that works with cookies and JavaScript. The easiest way of bot/spam prevention is to use the NoBot component in the AJAX Control Toolkit.
http://www.asp.net/AJAX/AjaxControlToolkit/Samples/NoBot/NoBot.aspx