- Browser makers Apple, Google, Microsoft, and Mozilla have announced Interop 2024, a project to promote web browser interoperability.
- JPEG XL, a potential replacement for JPEG and PNG image formats, was not included in Interop 2024.
- The rejection of JPEG XL has been blamed on Google, with the Google Chrome team deciding not to support the image compression technology.
Archive link: https://archive.ph/nulY6
Am I having a stroke, or is this headline horrendously written?
I read it four times and I still don’t understand what love-in means.
Dumb way of saying orgy.
I still don’t understand. WTF are we talking about? This is tech news, not a celeb scandal. Why can’t we just use simple words!
Why say lot word when few word do trick?
What’s the smart way?
Orgy.
Dinner Party. I think group sex is just implied, right?
I did decline an event because it said “Dinner Party” in quotes.
When they explained, they meant because it’s not really dinner but snacks and board games. Shame. Was expecting orgy
That’s called “Game Night”
Don’t ever go to a game night that is called a “dinner party” anyway. You’re likely to get roped into their self-created board game that is, “okay so it’s got a lot of rules and 1400 pieces, but I’ve written them all down on this 20 page spiral bound document and everyone will get a copy and an hour to read”
P.s. Fuck you Aaron, I will never come to your “dinner party” again.
I believe you just responded to a message with the answer to your question.
Horrible headline.
Browser maker love-in → Chromium (used by most browsers)
snubs → doesn’t support
Google-shunned JPEG XL → JPEG XL (because Google doesn’t like it)
It was Chrome and Firefox both who were against the format, both saying it was too expensive to implement for too small a benefit.
Thank you. I legitimately could not understand the title.
Something about love-in subs for Google? And also JPEG?
The Register has been doing this shit for years. They’re trying to sound smart…
I think these days that’s the rule: if it piques your interest but you have some trouble understanding the headline, you may just click on the article.
What is this bullshit titlegore
Right? I read it like three times thinking I was just missing an inflection or something. Jesus
I still won’t get over it and will keep fighting for JPEG XL. It would fix so many issues and greatly reduce the bandwidth needs of the internet, while having neither weird licensing nor royalties, and without being a “what if we just took one frame from a video” picture format. Also it can be losslessly re-encoded back to JPEG for legacy uses. What more could one want?
Well… Google wants weird licensing or royalties, that’s why they keep stamping it down.
I mean there are advantages to using AV1 for photos… Hardware accelerated decoding being one.
Decoding a large AVIF image grid should in theory work on a GPU and happen faster with less power than any software based image format implementation.
AV1 is also just an awesome format that’s entirely free to use out of the gate.
Well yes, however without acceleration JPEG XL is many times faster. Also if you only have a CPU for example.
It’s also highly parallelizable compared to AVIF, which matters a lot considering the number of cores is growing with the likes of ARM and hybrid-architecture CPUs.
AVIF also fares badly with high-fidelity and lossless encoding, has a third of the bit depth, and has pretty small dimension limits for something like photography.
I don’t think AVIF is per se a bad format. I just think if I want to replace a photo-oriented format, I’d like to do that with one that’s focused on “good” photos and not just an afterthought with up- and downsides.
> Also if you only have a CPU for example.
I thought even mobile-tier integrated GPUs can decode AV1 extremely quickly.
Well yes, sure, but remember AV1 decoding only became standard one or two GPU generations ago, and encoding only this generation. iPhones only got support with the 15 Pro, so it will be another generation before it trickles down to the base models. And what about the hundreds of millions of Android phones in Asia and the like with dirt-cheap SoCs? Pretty sure they won’t have dedicated AV1 decoding hardware for a long time.
So that’s a TON of hardware being made slow and inefficient if everything were to be AVIF tomorrow. Not saying AVIF decoding will be a big hurdle in the future, but how long until all this hardware browsing the web has been replaced? That’s why I think something that’s efficient and fast on CPUs without any specialised hardware is better suited as a replacement.
Servers often come without GPU, and they’re usually the ones encoding image formats.
I don’t think we should worry about servers meant for image transcoding not having the proper hardware for image transcoding. The problem with the GPU requirement starts and ends with consumer devices imo
Thanks to wasm, you don’t have to bow to Google’s whim and can choose to include jpeg xl support on your websites if you want: https://github.com/niutech/jxl.js
Do you know if it uses the native decoder if available (so, in Safari I guess)? Doesn’t say in the readme.
I believe so. This line in the source code means it’ll only attempt the decoding if an `img` element for a `.jxl` image URL fails to load. If you’re on Safari, you can verify it by going to the demo page at https://niutech.github.io/jxl.js/ and inspecting the image element. If the `src` attribute contains a blob, then it’s decoded using the wasm decoder. If the `src` attribute contains a URL to a `.jxl` file, then it’s decoded natively.
Very cool, thanks. Will keep this in mind.
I read “wasm” as “wasp” – white, Anglo-Saxon – and then my brain came up with “men” because Protestant didn’t make sense. And I continued reading the sentence until the context didn’t make sense.
But it still kind of does.
(Yes, I know web assembly is a thing. Just making conversation.)
Wasps are also a type of insect.
I expected Mozilla to implement this. I don’t know how they expect to gain market share by just following in Google’s footsteps every step of the way.
Is Firefox its own browser or just Chrome with a different engine? Even Apple supports JXL, well, the decoding anyway.
Because Mozilla really doesn’t care about what people think anymore. They’re an incredibly bureaucratic group dealing with a lot of red tape, positioned as a force for good that doesn’t always meet the mark. It’s the main reason Firefox doesn’t have a lot of things (that it honestly should have).
Also, Firefox is a completely original browser, but it doesn’t have a “Chromium” version of the browser like Google Chrome does. Both the Firefox commercial product and the source code compile to the same thing.
I know, it was a rhetorical question, given that the stance they take on a lot of things always aligns with what Google wants.
Hey friend, for what it’s worth, when I read your question I was very much channeling Garth Algar
But with your question about it being its own browser
Firefox is its own.
> Is Firefox its own browser
Its own browser using the Gecko rendering engine.
It was a tongue in cheek, rhetorical question, regarding what I said before it.
Follow the funding
“Overall, we don’t see JPEG-XL performing enough better than its closest competitors (like AVIF) to justify addition on that basis alone,” said Martin Thomson, distinguished engineer at Mozilla, last year. “Similarly, its feature advancements don’t distinguish it above the collection of formats that are already included in the platform.”
So is this a legit take on the technology? Sounds like an expert in the field is pretty convinced that this file format isn’t really worth its weight. What does JXL give the web that other file formats don’t?
Perhaps true from his… perspective. I’ve found JXL surprisingly awesome and easy to use (size, quality, speed, intuitive encoding options with lossless, supported in XnView & XnConvert for easy batches). AVIF was terrible in real-world use last I tried (and blurs fine details).
I’m still a big Mozilla & Firefox fan, but a few decisions over the past few years seem like they’re being dictated or vetoed by a few lofty individuals (while ignoring popular user requests). Sad.
I’ve read a comparison of several newer file formats (AVIF, HEIC, WebP) with JPEG XL. The conclusion was that JPEG XL was on par in terms of compression, sometimes better, and very fast. Also, it can re-compress JPGs directly.
here’s an article describing it https://cloudinary.com/blog/the-case-for-jpeg-xl
The big thing, to me, is that it can losslessly encode JPEGs, the dominant format for allllll sorts of archived images. That’s huge for migration of images that don’t necessarily exist in any other format.
Plus, as I understand it, JPEG XL outperforms those video-derived formats in lossless, high-resolution applications relating to physical printing and scanning workflows, and in encoding in new or custom color spaces. It’s designed to work in a broader set of applications than the others, beyond just web images in a browser.
If Google says chromium won’t support a feature it won’t be used. The majority of browsers are Chromium under the hood.
A third party adaptation of Chromium could add support for other formats, the ones we know about right now just don’t bother.
This is the best summary I could come up with:
The process began last year by gathering proposals for web technologies that group members will try to harmonize using automated tests.
The goal is to ensure browser implementations of these technologies match specifications in order to make the web platform better for developers.
Mozilla has not jumped on the JPEG XL bandwagon either: The Firefox maker said it’s neutral with regard to the technology, citing cost and lack of significant differentiation from other image codecs.
“Overall, we don’t see JPEG-XL performing enough better than its closest competitors (like AVIF) to justify addition on that basis alone,” said Martin Thomson, distinguished engineer at Mozilla, last year.
And it has since resisted entreaties to reconsider – despite Apple’s endorsement last year and recent support from Samsung and apparent interest from Microsoft.
“Chrome is ‘against’ because of ‘insufficient ecosystem interest’ and because they want to promote improvements in existing codecs,” said Sneyers, pointing to JPEG, WebP, and AVIF.
The original article contains 907 words, the summary contains 155 words. Saved 83%. I’m a bot and I’m open source!
From Wiki:
JPEG XL supports lossy compression and lossless compression of ultra-high-resolution images (up to 1 terapixel), up to 32 bits per component, up to 4099 components (including alpha transparency), animated images, and embedded previews.
Why 4099 components? Why so many? And why 4099 in particular? 4096+3 with 3 being RGB?
On a side note, 1 terapixel is just crazy. A square one million pixels on a side has that many pixels. So about 1,000 frames of 1080p would fit into this square vertically and about 500 horizontally. Who has eyes to see all this pixel-perfectly?
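A quick back-of-the-envelope check of that arithmetic (the only assumption is the standard 1920×1080 full-HD frame size):

```javascript
// Sanity-checking the terapixel figures from the comment above.
const side = 1_000_000;                      // square that is 1,000,000 px on each edge
const pixels = side * side;                  // 1e12 px = one terapixel
const framesTall = Math.floor(side / 1080);  // 925 full-HD frames stacked top to bottom
const framesWide = Math.floor(side / 1920);  // 520 full-HD frames placed side by side
console.log(pixels, framesTall, framesWide); // 1000000000000 925 520
```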
If you zoom in on it (a pretty common thing to do with pictures) enough, most people.
That would be CSI: Miami-style zoom, where they can identify a yawning killer by his tooth fillings, whose image was reflected in a window, which was reflected in the eye of a random person far in the background of a shot.
Kuala Lumpur 846 gigapixels (2014) (may or may not load because their API server’s SSL certificate expires today; if you can’t get it to load, open beta-api.panaxity.com and whitelist it, then reload https://www.panaxity.com/). And yes, you can zoom in and see people hanging about in their rooms in distant apartments.
Would be cool if it could be saved as a single gigantic image instead of tiles of multiple images.