• Browser makers Apple, Google, Microsoft, and Mozilla have announced Interop 2024, a project to promote web browser interoperability.
  • JPEG XL, a potential replacement for JPEG and PNG image formats, was not included in Interop 2024.
  • The rejection of JPEG XL has been blamed on Google, with the Google Chrome team deciding not to support the image compression technology.

Archive link: https://archive.ph/nulY6

        • red_pigeon@lemm.ee · ↑13 ↓1 · 9 months ago

          I still don’t understand. WTF are we talking about? This is tech news, not a celeb scandal. Why can’t we just use simple words!

              • Potatos_are_not_friends@lemmy.world · ↑5 ↓1 · 9 months ago

                I did decline an event because it said “Dinner Party” in quotes.

                When they explained, they meant it’s not really dinner, just snacks and board games. Shame. Was expecting an orgy.

                • TherouxSonfeir@lemm.ee · ↑10 ↓1 · 9 months ago

                  That’s called “Game Night”

                  Don’t ever go to a game night that’s called a “dinner party” anyway. You’re likely to get roped into their self-created board game that is, “okay, so it’s got a lot of rules and 1,400 pieces, but I’ve written them all down in this 20-page spiral-bound document and everyone will get a copy and an hour to read.”

                  P.s. Fuck you Aaron, I will never come to your “dinner party” again.

    • ted@sh.itjust.works · ↑43 · 9 months ago

      Horrible headline.

      “Browser maker love-in” → Chromium (used by most browsers)

      “snubs” → doesn’t support

      “Google-shunned JPEG XL” → JPEG XL (because Google doesn’t like it)

      • psud@lemmy.world · ↑8 ↓1 · 9 months ago

        It was both Chrome and Firefox that were against the format; both said it was too expensive to implement for too little benefit.

    • faintwhenfree@lemmus.org · ↑4 · 9 months ago

      I think that’s the rule these days: if it piques your interest but you have trouble understanding the headline, you might just click through to the article.

    • dukatos@lemm.ee · ↑3 · 9 months ago

      The Register has been doing this shit for years. They’re trying to sound smart…

    • TheFriar@lemm.ee · ↑13 · 9 months ago

      Right? I read it like three times thinking I was just missing an inflection or something. Jesus

  • drkt@lemmy.dbzer0.com · ↑107 ↓4 · 9 months ago

    I’m a photographer; AVIF and WebP do not serve my needs, JPEG-XL does.

    I run my own website, down to the hardware in my living room; I will not store five variations of any one picture just so I can serve the best available one to clients, when JPEG works everywhere and JPEG-XL offers me a lossless transition from JPG to JXL.
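
    A minimal sketch (not this commenter’s actual setup) of what “storing several variations and serving the best available one” usually amounts to: a small Node/TypeScript handler that picks a file per request based on the Accept header. The file names are placeholders; JPEG XL’s lossless JPEG recompression is what would let a site keep a single file instead.

    ```ts
    // Hypothetical content-negotiation sketch: several encodings of one photo,
    // serve the best one the client advertises support for in its Accept header.
    import { createServer } from "node:http";
    import { createReadStream, existsSync } from "node:fs";

    // Preference order; file names are illustrative placeholders.
    const variants: Array<[mime: string, file: string]> = [
      ["image/jxl", "photo.jxl"],
      ["image/avif", "photo.avif"],
      ["image/webp", "photo.webp"],
      ["image/jpeg", "photo.jpg"], // universal fallback
    ];

    createServer((req, res) => {
      const accept = req.headers.accept ?? "";
      const [mime, file] =
        variants.find(([m, f]) => accept.includes(m) && existsSync(f)) ??
        variants[variants.length - 1];
      res.writeHead(200, { "Content-Type": mime, Vary: "Accept" });
      createReadStream(file).pipe(res);
    }).listen(8080);
    ```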

    Chromium is literally the only reason JPEG-XL isn’t being adopted right now, and it’s so obvious that Google is pulling those strings.

    JPEG-XL Ride or Die.

    • Potatos_are_not_friends@lemmy.world · ↑38 ↓1 · edited · 9 months ago

      Mozilla has not jumped on the JPEG XL bandwagon either: The Firefox maker said it’s neutral with regard to the technology, citing cost and lack of significant differentiation from other image codecs.

      Two browser orgs.

      Not arguing, just pointing it out.

      • drkt@lemmy.dbzer0.com · ↑25 · 9 months ago

        Mozilla are also dumb, yes, but they aren’t the ones in control of 90% of the browser market share.

      • Turun@feddit.de · ↑16 · 9 months ago

        There is a difference between indifference and actively working against something.

    • ILikeBoobies@lemmy.ca · ↑31 · 9 months ago

      I saw a web warning today saying “if you can’t see {x} then consider upgrading to Firefox” and it filled me with joy.

    • abhibeckert@lemmy.world · ↑8 ↓10 · edited · 9 months ago

      Why do you need to transition from JPEG to anything else? Just keep using JPEG for old files.

      Chromium is literally the only reason jpeg-xl isn’t being adopted right now

      That’s not a “reason”, it’s a “decision”. Their actual reason is pretty good: they don’t want to support every image format that comes along. That’s a slippery slope; there are several hundred image formats. Should they all be supported? How many of them have security flaws? How much work is it to check for security flaws even if none exist?

      The original image formats for the web (JPEG, GIF, PNG, SVG) all have major benefits over each other. That’s why they were successful. There used to be other widely used image formats, but they all fell by the wayside, because the goal is to not have many formats. Ideally we’d only have one.

      And WebP moves a long way in that direction; it does basically everything except vector images. AVIF is still around for efficiency reasons (it’s very easy/fast/low-power for camera hardware to create an AVIF).

      JPEG-XL has advantages, but unlike those two they are really small and not worth the effort.

      • Turun@feddit.de · ↑21 · 9 months ago

        Converting from JPEG to JXL brings some serious space savings and can be done losslessly.

        The original image formats for the web, JPEG, GIF, PNG, all have major benefits compared to each other. That’s why they were successful.

        We change video formats without any major benefit of one over the other, and I think it’s totally reasonable to do the same with image formats, especially when the data can be losslessly compressed even further.

        I wouldn’t call speed a major factor for image processing anyway. It’s hugely important for video, which is where AVIF comes from, but much less so when there is no hard 30×2160×3840 pixels/s benchmark you need to hit.
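
        For reference, the lossless JPEG→JXL transcode mentioned above is a one-step operation with the libjxl command-line tools; here is a minimal sketch (assuming cjxl and djxl are installed, and using placeholder file names) that also checks the round trip, since, as far as I understand, cjxl keeps JPEG reconstruction data by default for JPEG input:

        ```ts
        // Sketch: JPEG -> JXL losslessly, then back, and verify bit-exactness.
        import { execFileSync } from "node:child_process";
        import { createHash } from "node:crypto";
        import { readFileSync, statSync } from "node:fs";

        const sha256 = (path: string) =>
          createHash("sha256").update(readFileSync(path)).digest("hex");

        execFileSync("cjxl", ["photo.jpg", "photo.jxl"]);           // JPEG -> JXL (lossless recompression)
        execFileSync("djxl", ["photo.jxl", "photo.roundtrip.jpg"]); // JXL -> original JPEG

        console.log("jpg bytes:", statSync("photo.jpg").size);
        console.log("jxl bytes:", statSync("photo.jxl").size);
        console.log("bit-exact:", sha256("photo.jpg") === sha256("photo.roundtrip.jpg"));
        ```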

      • drkt@lemmy.dbzer0.com · ↑12 · 9 months ago

        Just keep using jpeg for old files.

        And bloat up my codebase with support for a new file extension every 2-5 years? I’ll just keep using JPG, then, like the rest of the sane internet, and the format will never die. JXL offered an actual upgrade path; WebP and AVIF don’t.

      • TherouxSonfeir@lemm.ee · ↑7 ↓1 · 9 months ago

        They do run a physical machine in their living room for their website… for some reason…

  • RayJW@sh.itjust.works · ↑74 · 9 months ago

    I still won’t get over it and will keep fighting for JPEG XL. It would fix so many issues and greatly reduce the bandwidth needs of the internet, without weird licensing or royalties and without being a “what if we just took one frame from a video” picture format. Also, it can be losslessly converted back to JPEG for legacy uses. What more could one want?

    • Dark Arc@social.packetloss.gg · ↑22 ↓2 · 9 months ago

      I mean, there are advantages to using AV1 for photos… hardware-accelerated decoding being one.

      Decoding a large AVIF image grid should in theory work on a GPU and happen faster, with less power, than any software-based image format implementation.

      AV1 is also just an awesome format that’s entirely free to use out of the gate.

      • RayJW@sh.itjust.works · ↑19 · edited · 9 months ago

        Well yes, but without hardware acceleration JPEG XL is many times faster. Also if you only have a CPU, for example.

        It’s also highly parallelizable compared to AVIF, which matters a lot considering the number of cores keeps growing with the likes of ARM and hybrid-architecture CPUs.

        AVIF also fares badly with high-fidelity and lossless encoding, has a third of the bit depth, and has pretty small dimension limits for something like photography.

        I don’t think AVIF is per se a bad format. I just think that if I want to replace a photo-oriented format, I’d like to do it with one that’s focused on “good” photos and not just an afterthought with up- and downsides.

        • QuaternionsRock@lemmy.world · ↑4 ↓1 · 9 months ago

          Also if you only have a CPU for example.

          I thought even mobile-tier integrated GPUs can decode AV1 extremely quickly.

          • RayJW@sh.itjust.works · ↑5 · 9 months ago

            Well yes, sure, but remember AV1 decoding only became standard one or two GPU generations ago, and encoding only this generation. iPhones only got support with the 15 Pro, so it will be another generation before it trickles down to the base models. And what about the hundreds of millions of Android phones in Asia and the like with dirt-cheap SoCs? Pretty sure they won’t have dedicated AV1 decoding hardware for a long time.

            So that’s a TON of hardware being made slow and inefficient if everything were to be AVIF tomorrow. I’m not saying AVIF decoding will be a big hurdle in the future, but how long until all the hardware browsing the web today has been replaced? That’s why I think something that’s efficient and fast on CPUs, without any specialised hardware, is better suited as a replacement.

          • anlumo@lemmy.world · ↑4 · 9 months ago

            Servers often come without a GPU, and they’re usually the ones encoding image formats.

            • QuaternionsRock@lemmy.world · ↑1 · 9 months ago

              I don’t think we should worry about servers meant for image transcoding not having the proper hardware for image transcoding. The problem with the GPU requirement starts and ends with consumer devices imo

    • 2xsaiko@discuss.tchncs.de · ↑5 · 9 months ago

      Do you know if it uses the native decoder if available (so, in Safari I guess)? Doesn’t say in the readme.

      • redcalcium@lemmy.institute · ↑12 · edited · 9 months ago

        I believe so. This line in the source code means it’ll only attempt wasm decoding if an img element pointing to a .jxl image URL fails to load.

        If you’re on Safari, you can verify it by going to the demo page at https://niutech.github.io/jxl.js/ and inspecting the image elements. If the src attribute contains a blob: URL, the image was decoded with the wasm decoder. If the src attribute contains the URL of a .jxl file, it was decoded natively.
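
        That fallback pattern, roughly sketched (this is not jxl.js’s actual API; `decodeJxlToImageData` is a hypothetical wrapper around a wasm JXL decoder): native decoding gets first try, and only a failed load triggers the wasm path and the blob: URL swap described above.

        ```ts
        // Hypothetical wasm decoder wrapper -- jxl.js's real internals differ.
        declare function decodeJxlToImageData(bytes: ArrayBuffer): Promise<ImageData>;

        async function fallbackDecode(img: HTMLImageElement): Promise<void> {
          const bytes = await (await fetch(img.src)).arrayBuffer();
          const pixels = await decodeJxlToImageData(bytes);

          // Draw the decoded pixels onto a canvas and hand the <img> a blob: URL,
          // which is why an inspected element shows "blob:" when wasm decoding ran.
          const canvas = document.createElement("canvas");
          canvas.width = pixels.width;
          canvas.height = pixels.height;
          canvas.getContext("2d")!.putImageData(pixels, 0, 0);
          canvas.toBlob((blob) => {
            if (blob) img.src = URL.createObjectURL(blob);
          }, "image/png");
        }

        // Only .jxl images get the fallback; the error handler fires only when
        // native decoding is unavailable (i.e. outside Safari).
        document.querySelectorAll<HTMLImageElement>('img[src$=".jxl"]').forEach((img) => {
          img.addEventListener("error", () => void fallbackDecode(img), { once: true });
        });
        ```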

    • Troy@lemmy.ca · ↑2 ↓11 · 9 months ago

      I read “wasm” as per “wasp” – white, Anglo-Saxon – and then my brain created “men” because Protestant didn’t make sense. And I kept reading the sentence until the context stopped making sense.

      But it still kind of does.

      (Yes, I know web assembly is a thing. Just making conversation.)

  • verysoft@kbin.social · ↑30 · edited · 9 months ago

    I expected Mozilla to implement this; I don’t know how they expect to gain market share by just following in Google’s footsteps every step of the way.
    Is Firefox its own browser or just Chrome with a different engine? Even Apple supports JXL, well, the decoding anyway.

    • soulfirethewolf@lemdro.id · ↑13 ↓1 · 9 months ago

      Because Mozilla really doesn’t care what people think anymore. They’re an incredibly bureaucratic group tied up in a lot of red tape, positioned as a force for good but not always hitting the mark. It’s the main reason Firefox lacks a lot of things (that it honestly should have).

      Also, Firefox is a completely original browser, but it doesn’t have a “Chromium” upstream the way Google Chrome does: the commercial Firefox product and the open-source code compile to the same thing.

      • verysoft@kbin.social · ↑4 · 9 months ago

        I know, it was a rhetorical question, given that the stance they take on a lot of things always aligns with what Google wants.

        • Jarix@lemmy.world · ↑8 · 9 months ago

          Hey friend, for what it’s worth, when I read your question I was very much channeling this Garth Algar.

          But with your question about it being its own browser.

  • ItsMeSpez@lemmy.world · ↑26 ↓1 · 9 months ago

    “Overall, we don’t see JPEG-XL performing enough better than its closest competitors (like AVIF) to justify addition on that basis alone,” said Martin Thomson, distinguished engineer at Mozilla, last year. “Similarly, its feature advancements don’t distinguish it above the collection of formats that are already included in the platform.”

    So is this a legit take on the technology? It sounds like an expert in the field is pretty convinced that this file format isn’t really worth its weight. What does JXL give the web that other file formats don’t?

    • Vanon@lemmy.world · ↑11 · edited · 9 months ago

      Perhaps true from his… perspective. I’ve found JXL surprisingly awesome and easy to use (size, quality, speed, intuitive encoding options including lossless; supported in XnView and XnConvert for easy batches). AVIF was terrible in real-world use last time I tried it (and it blurs fine details).

      I’m still a big Mozilla and Firefox fan, but a few decisions over the past few years seem like they’re being dictated or vetoed by a few lofty individuals (while ignoring popular user requests). Sad.

      • GamingChairModel@lemmy.world · ↑1 · 9 months ago

        The big thing, to me, is that it can losslessly re-encode JPEGs, the dominant format for allllll sorts of archived images. That’s huge for migrating images that don’t necessarily exist in any other format.

        Plus, as I understand it, JPEG XL performs better than those video-derived formats in lossless, high-resolution applications like physical printing and scanning workflows, or encoding in new or custom color spaces. It’s designed to work in a broader set of applications than the others, beyond just web images in a browser.

  • Flipper@feddit.de · ↑25 ↓2 · 9 months ago

    If Google says Chromium won’t support a feature, it won’t be used. The majority of browsers are Chromium under the hood.

    • anlumo@lemmy.world · ↑1 · 9 months ago

      A third-party adaptation of Chromium could add support for other formats; the ones we know about right now just don’t bother.

  • AutoTL;DR@lemmings.world (bot) · ↑5 ↓1 · 9 months ago

    This is the best summary I could come up with:


    The process began last year by gathering proposals for web technologies that group members will try to harmonize using automated tests.

    The goal is to ensure browser implementations of these technologies match specifications in order to make the web platform better for developers.

    Mozilla has not jumped on the JPEG XL bandwagon either: The Firefox maker said it’s neutral with regard to the technology, citing cost and lack of significant differentiation from other image codecs.

    “Overall, we don’t see JPEG-XL performing enough better than its closest competitors (like AVIF) to justify addition on that basis alone,” said Martin Thomson, distinguished engineer at Mozilla, last year.

    And it has since resisted entreaties to reconsider – despite Apple’s endorsement last year and recent support from Samsung and apparent interest from Microsoft.

    “Chrome is ‘against’ because of ‘insufficient ecosystem interest’ and because they want to promote improvements in existing codecs,” said Sneyers, pointing to JPEG, WebP, and AVIF.


    The original article contains 907 words, the summary contains 155 words. Saved 83%. I’m a bot and I’m open source!

  • MxM111@kbin.social · ↑3 ↓7 · 9 months ago

    From Wiki:

    JPEG XL supports lossy compression and lossless compression of ultra-high-resolution images (up to 1 terapixel), up to 32 bits per component, up to 4099 components (including alpha transparency), animated images, and embedded previews.

    Why 4099 components? Why so many? And why 4099 in particular? 4096+3 with 3 being RGB?

    On a side note, 1 terapixel is just crazy. A square one million pixels on each side has that many pixels, so about 1,000 1080p frames fit into it vertically and about 500 horizontally. Who has eyes to see all of that pixel-perfectly?
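
    For the curious, the arithmetic behind that (taking a 1080p frame as 1920×1080, and the image as a 1,000,000 × 1,000,000 pixel square):

    ```ts
    const side = 1_000_000;
    console.log(side * side);             // 1e12 pixels = 1 terapixel
    console.log(Math.floor(side / 1080)); // ~926 frames stacked vertically (roughly 1,000)
    console.log(Math.floor(side / 1920)); // ~520 frames side by side (roughly 500)
    ```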

    • hamsterkill@lemmy.sdf.org · ↑11 · 9 months ago

      On a side note, 1 terapixel is just crazy. A square one million pixels on each side has that many pixels, so about 1,000 1080p frames fit into it vertically and about 500 horizontally. Who has eyes to see all of that pixel-perfectly?

      If they zoom in on it enough (a pretty common thing to do with pictures), most people.

      • MxM111@kbin.social · ↑7 · edited · 9 months ago

        That would be CSI: Miami-style zoom, where they can identify a yawning killer by his tooth fillings, whose image was reflected in a window, which was in turn reflected in the eye of a random person far in the background of the shot.