There is an important takeaway shining through the data: JS failing to load can happen to anyone. Think of Google Analytics having a hiccup and taking half the web down with it, which happened regularly before async tracker embedding became the norm. This is a key part of accessibility that should be hammered home to every able-bodied developer on earth: it can happen to you too, and you don't need to step in front of a bus for it. Develop the site so that it's robust against all the ways the receiving end may struggle.
Comment by Jake Archibald (jaffathecake) posted on 22 October

Browsers with prescanners (all modern ones) could make up some of that 0.9%. The browser could decide to preload a page that it thinks will be loaded, so it fetches the HTML and downloads the assets, but doesn't execute script until the page is actually opened. If the page isn't opened, it'll fall within that 0.9%.

Comment by michaeldfallen posted on 22 October

That would be a difficult situation to test for. Any idea how we might improve the test to distinguish between a browser using a prescanner and a browser failing, for some other reason, to download either the with-js or without-js images?
Browsers send an HTTP header when they pre-fetch resources. This might help?

Comment by Steve Souders posted on 22 October

Jake is right. I bet nearly all of this 0.9% is down to prescanning and prefetching.

Comment by Pete Herlihy posted on 22 October

Yeah, very good point Jake. I have updated the post to include this reason.
Comment by aegisdesign posted on 22 October

I wouldn't think so. If a browser isn't looking for images then it wouldn't have requested the base image either, which is what we are taking as the total number of visits. So our numbers might be very slightly low overall, but of those that did request the base image, these proportions won't be affected.
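For readers who haven't seen the experiment itself: the measurement being discussed in these comments relies on three tiny beacon images, one requested by plain HTML, one requested only if JavaScript runs, and one inside a noscript element. The markup below is a minimal sketch of that idea only; the file names are made up, and the real GDS implementation (on GitHub, as mentioned further down) differs in its details.

<!-- Baseline beacon: requested by any browser that downloads images at all. -->
<img src="/beacon/base.gif" alt="" width="1" height="1">

<!-- JavaScript beacon: only requested if script actually executes. -->
<script>
  var beacon = new Image();
  beacon.src = '/beacon/with-js.gif';
</script>

<!-- noscript beacon: only requested if the browser reports JavaScript as disabled. -->
<noscript>
  <img src="/beacon/without-js.gif" alt="" width="1" height="1">
</noscript>

Visits that request the base image but neither of the other two make up the unexplained group these comments are trying to account for.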
Comment by Mo posted on 21 October

I suspect the primary cause will be local blocking: anecdotally, I know a few people who routinely browse with JS whitelisted (and then only for the sites which require it); unfortunately, browsers tend to offer JS only as a global switch, so the extensions which implement selective blocking (or selective anti-blocking) must leave JS enabled globally in order for the whitelist to work, which breaks the noscript element.

Comment by Mike Davies posted on 21 October

The Yahoo test was flawed in that any user agent that wasn't A-grade was redirected to a static HTML page, and thus wasn't measured.
Thus their numbers underrepresented the size of the non-JavaScript-running audience. Really glad you didn't fall into that rabbit hole. Now, my challenge for you: you have ideas about the reasons why JavaScript was failing to execute. Can you test your thesis and paint us a better picture of the main culprits that cause JavaScript not to run?
You've started down that path already by successfully reporting how many people chose to disable JavaScript in their browser. Thanks again for the splendid work, and for demonstrating once again why evergreen techniques like progressive enhancement work with the strengths of the web and remain a web development best practice.

Comment by samjsharpe posted on 22 October

We are missing detail on the 0.9%.
Correlating those requests, to enable us to conclusively say which unique agent asked for base-js but neither of the other beacon images, would involve collecting a lot more fingerprinting information about those agents to uniquely identify them, and unfortunately for us the practical and privacy concerns outweigh the benefit. I'm sure we'd be really interested to see more research in this area, but for GOV.UK it's critical to maintain the trust of the users.
It's therefore important that we don't abuse that trust by trying to identify and track them more than we absolutely need to.

Comment by Mike Brittain posted on 22 October

The code for this experiment is on GitHub (linked from the article).
Why not run this yourself and provide updates on your own analysis, and your own observations on how the results may be skewed? What would be interesting to see with these results is a breakdown of the browsers, versions, and OSes that had "failing" JavaScript. Is there worse behavior amongst certain classes of user-agents?
How about by ISP? Some satellite service providers have very bad caching and prefetching behaviors. This is an excellent experiment, and I love seeing results like these being made available to the public. Thank you!

This description of the Yahoo experiment is misleading. I should know: I'm the one who ran it and wrote up the results.
While it's true that the experiment counted only A-grade browsers, what Mike fails to mention is that the C-grade traffic was so small as to be statistically insignificant.

Comment by Pete Herlihy posted on 21 October

Good question, but yes, these were considered and discarded. Bots and spiders won't usually request assets from a page they crawl, and looking at the user agent data this appears to be confirmed.
Thanks for getting back to me. Finland was a great surprise: from our measurements, around 1% of visits. Another surprise in our data is that traffic from Africa had high JavaScript-disabled rates.
Sign up to our mailing list below to get that and other reports when we produce them.

Research into users that disable JavaScript is not new. Back in 2010, Yahoo! ran a similar study. They found that the share of visitors with JavaScript disabled hovered around 1%, ranging from roughly 0.25% to about 2% depending on the country.

Give the basic info, with a clear route for how to go further: update your browser! (Answer by Matthew Trow)
How is this constructive? Whilst true, it doesn't answer the question in any way whatsoever. The OP is asking what percentage of users have JavaScript disabled, not reasons why it is dumb to support said users.

This does not answer the question.

It is super simple to update to a modern browser these days. I think this clearly is an answer to the question.

Discussions: the most active and extensive discussions on Stack Exchange sites on this topic, plus a write-up on PunkChip.

Stats: you're right.

Personal thoughts: in my personal opinion, it's fair enough to require some very specific areas of a site to require JavaScript, but you should try as much as possible to provide an alternative if that's the case.

That 1% or so can easily access that information just by updating their technology for free.
If it were true that there was no way for them to access it, then I would agree, but I don't think that's the case. Just my opinion, but JavaScript is now a basic language of the web, and I think it's time we stopped making excuses and adding work for ourselves for an incredibly small minority of people who CHOOSE to limit their experience, usually for outdated reasons.

I don't think you get the point. Unimpaired people who aren't using JavaScript are usually doing so nowadays for privacy reasons (it's a lot harder to track you on the web), but some people who are visually impaired use JavaScript-reliant technologies to help them get around the web without vision.
These people don't care about old technologies; they care about being able to "view" the web at all. And while smaller company sites should really feel no need to accommodate them, sites like universities and government, where information is key, definitely should.

These stats are now almost 4 years old, and the numbers will only have got smaller since. And how many businesses do you know that are going to tell their blind employee they can't use a browser new enough to have modern screen-reading technology?
Practically everyone who disables JS now does it by choice, even impaired people. It's time we let go of these old, incredibly outdated notions that we can't use JavaScript!

In any case, it looks to be a tiny proportion. (Answer by Luke Puplett)
Think also of the emerging global markets; countries still battling to build a network of fast internet, with populations unable to afford fast hardware to run CPU-intensive JavaScript.
Or think of the established markets, where even an iPhone X on a 4G connection is not immune to the effects of a partially loaded webpage interrupted by their train going into a tunnel. The web is a hostile, unpredictable environment, which is why many developers follow the principle of progressive enhancement to build their sites up from a core experience of semantic HTML, layering CSS and unobtrusive JavaScript on top of that.
I wanted to see how many sites apply this in practice. What better way than disabling JavaScript altogether? After disabling JavaScript, my first port of call was my personal portfolio site, which runs on WordPress, with the aim of writing down my experiences in real time. I felt quite comfortable without the toolbars until I needed to embed screenshots in my post. But I was quite surprised that the separate media screen also required JavaScript!
There was no way of determining the thumbnail URL of the uploaded image, and any captions I wrote also had to be copied across manually. I soon got fed up with this approach and planned to come back the next day to re-insert all of the images once I allowed myself to use JavaScript again. Finally, I have a GitHub embed on my site.
I was half hoping to shock you with the before and after stats (megabytes of JS for a small embed! End of the world!). Leading by example! For the sake of a styled tweet, a GitHub embed and a full-fat Instagram embed, my site grows an extra KB. Websites, or more accurately their individual pages, tend to fall into one of a few categories.
At first glance, Amazon does a cracking job with its non-JavaScript solution, although the main product image is missing. On closer inspection, quite a few things were a bit broken on the noscript version. It would have been nice if these thumbnails were links to the full image, opening in a new tab. They could then be progressively enhanced into the image gallery by using JavaScript:.
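Something along these lines might work (a sketch of the idea only, not Amazon's actual markup; the class names, file names and alt text are invented):

<!-- Core experience: each thumbnail is a plain link to the full-size image. -->
<ul class="product-thumbnails">
  <li>
    <a href="/images/product-large-1.jpg" target="_blank">
      <img src="/images/product-thumb-1.jpg" alt="Product, front view">
    </a>
  </li>
  <li>
    <a href="/images/product-large-2.jpg" target="_blank">
      <img src="/images/product-thumb-2.jpg" alt="Product, side view">
    </a>
  </li>
</ul>
<img id="main-product-image" src="/images/product-large-1.jpg" alt="Product, front view">

<script>
  // Enhancement: intercept thumbnail clicks and swap the main image in place,
  // turning the plain list of links into a gallery. Without JavaScript, the links still work.
  document.querySelectorAll('.product-thumbnails a').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault();
      var main = document.getElementById('main-product-image');
      main.src = link.getAttribute('href');
      main.alt = link.querySelector('img').alt;
    });
  });
</script>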
The Amazon integrated modal form requires JavaScript to work. This could be progressively enhanced into the integrated modal by using Ajax to download the HTML separately.
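One way of doing that, sketched under the assumption of a hypothetical /report-feedback page that also works as a standalone form (this is not Amazon's actual code):

<!-- Core experience: a normal link to a standalone page containing the form. -->
<a id="feedback-link" href="/report-feedback">Report incorrect product information</a>
<div id="modal" hidden></div>

<script>
  // Enhancement: fetch the standalone page and show its form in a modal instead.
  // If the fetch fails, or this script never runs, the link still navigates to the page.
  document.getElementById('feedback-link').addEventListener('click', function (event) {
    event.preventDefault();
    fetch('/report-feedback')
      .then(function (res) { return res.text(); })
      .then(function (html) {
        var modal = document.getElementById('modal');
        modal.innerHTML = html;      // in practice you would extract just the form fragment
        modal.hidden = false;
      })
      .catch(function () { window.location.href = '/report-feedback'; });
  });
</script>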
On the whole, I was actually pleasantly surprised by just how well the site worked without JavaScript. It could just as easily have been a blank white page.

At first, the next site I visited looked indistinguishable from its JavaScript-enabled version.
Unfortunately, the price history chart did not render. It did provide an alt text fallback, but the alt text gave me no idea of whether the price trend had been going up or down. General suggestion: provide meaningful alt text at all times.
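A small sketch of what that could look like (the figures, file name and wording are invented): whether the chart is a static image or drawn by JavaScript, keep a fallback whose alt text actually answers the question a user is likely to have.

<figure>
  <img src="/charts/price-history.png"
       alt="Price history, last 12 months: mostly falling, from $49.99 down to $32.50, with a brief spike in November.">
  <figcaption>Price history for this product</figcaption>
</figure>

If the interactive chart is rendered with JavaScript, this figure can simply be the element the script replaces; when the script never runs, the descriptive alt text still tells the user which way the price has been heading.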