• Mitch Effendi (ميتش أفندي)@piefed.mitch.science · 4 days ago

    Honestly it’s a little staggering how much better web video got after the W3C got fed up with Flash and RealPlayer and finally standardized native, more efficient video playback.

    <video> was a revolution.

    • FauxLiving@lemmy.world · 3 days ago

      I remember; that was a dramatic change.

      Also, most people don’t remember this now, but YouTube was initially popular because its Flash video player was efficient, worked across many different system configurations and browsers, and dynamically changed resolution to match your connection.

      At that point you had some people with broadband connections and a lot more with dial-up. So often dial-up users would not be able to watch videos because they were only available in one resolution.

      YT had 144p (or less!) videos ready for dial-up users and higher resolution videos for broadband users and it automatically picked the appropriate video for the client. This made it so most people (dial-up users) would look to YT first, because you knew that YT would have a video that you could actually watch.

      Then Google bought them.
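The resolution-picking trick described above is easy to sketch: measure throughput, then serve the highest encode the connection can sustain. The thresholds below are made-up illustrations, not YouTube’s actual values.

```python
# Sketch of bandwidth-based stream selection, loosely in the spirit of the
# early YouTube player described above. Thresholds are illustrative
# guesses, not real YouTube values.

STREAMS = [          # (label, minimum sustained kbit/s), lowest first
    ("144p", 0),     # fallback every connection gets, even dial-up
    ("240p", 300),
    ("360p", 700),
    ("480p", 1500),
]

def pick_stream(measured_kbps: float) -> str:
    """Return the highest-resolution stream the connection can sustain."""
    best = STREAMS[0][0]
    for label, min_kbps in STREAMS:
        if measured_kbps >= min_kbps:
            best = label
    return best
```

A 56 kbit/s dial-up client lands on 144p, while broadband clients get the best encode their measured throughput allows.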

      • YouTube blew up the year I went to college and got access to a T3 line. 🤤 My school had pretty robust security, but it was policy-based. Turns out, if you were on Linux and couldn’t run the middleware, it would just go “oh you must be a printer, c’mon in!”

        I crashed the entire network twice, so I fished a computer out of the trash in my parents’ neighborhood, put Arch and rtorrent on it, and would just pipe my traffic via SSH to that machine. :p

        Ah, and the short era of iTunes music sharing… Good memories.

        • FauxLiving@lemmy.world · 3 days ago

          Yeah, my high school had a computer lab donated by Cisco to teach their CCNA course. There were like 2 students taking the class and 25 PCs, so we set up one to run WinMX, Kazaa and eDonkey.

          They all had CD-RW drives. We were minting music and movie CDs (DivX-encoded SD movies were under 650 MB, so they would fit on a CD) and selling them on campus for $3-5. You could get 100 blank CD-Rs for around $40, so it was very profitable.

      • Korhaka@sopuli.xyz · 2 days ago

        I forget that people still had dial-up in the mid-2000s; I always associate it with the 90s.

    • lightrush@lemmy.ca (OP) · 4 days ago

      Oh man, I was like a kid in a candy shop when I got my hands on Flash 4… built quite a few sites with it.

      • My unpopular opinion is that Flash was perhaps one of the greatest media standards of all time. Think about it: in 2002, people were packaging entire 15-minute animations with full audio and imagery, all encapsulated in a single file that could play in any browser, at under 10 MB each. Not to mention, it was one of the earliest formats to support streaming. It used vectors for art, which meant that a SWF file would look just as good today on a 4K screen as it did in 2002.

        It only became awful once we started forcing it to be things it was never meant to be, like a web design platform or a common platform for applets. Each more advanced version of its scripting brought new vulnerabilities.

        It was a beautiful way to spread culture back when the fastest Internet anyone could get was 1 MB/sec.

        • RheumatoidArthritis@mander.xyz · 3 days ago

          It worked well only on Windows PCs, back when the PC and Windows still weren’t the definitive winners of the technological race and people were using all kinds of computers.

    • lightrush@lemmy.ca (OP) · 4 days ago

      It’s a Framework with an 11th-gen Intel i5. I’ve never seen it below 11 W while doing this. I don’t recall the exact number I got in Debian 12, but I think it was in the 11-13 W range. The numbers were similar with Ubuntu LTS, which I used until about a year ago. Now I see 9-10 W. The screen is a 3:2 13" panel. Not sure about the encoding, but I have GPU decoding working in Firefox.

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 4 days ago

      It’s a YouTube video, so whatever codec YouTube serves these days. I tested with this M1 MacBook Pro and it was using about 7 W, so 3 W more is pretty good for pretty much anything. I think my 12th-gen laptop typically draws about 13-15 W doing the same thing, but with a much dimmer screen.

    • ChaoticNeutralCzech@feddit.org · 3 days ago

      You can use the Wattz app to monitor current/power flowing into/out of the battery on some Android phones. Yes, 3 W is about the average in normal use. Unfortunately you cannot gauge the power consumption while charging unless you have a USB wattmeter too: the system only measures battery current because it’s required for battery capacity/percentages.
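On Linux (and Android, which apps like Wattz read under the hood), the kernel’s power_supply class exposes battery voltage and current, so the draw is just V × I. A sketch, not Wattz’s actual code; the sysfs path and supply name (“BAT0”) vary by device:

```python
# Estimate battery power draw from the kernel's power_supply sysfs class.
# voltage_now is in microvolts and current_now in microamps; the supply
# name ("BAT0") is a common default, not guaranteed on every device.

def battery_power_watts(voltage_uv: int, current_ua: int) -> float:
    """P = V * I, converted from micro-units to watts."""
    return (voltage_uv * current_ua) / 1e12

def read_power(supply: str = "BAT0") -> float:
    base = f"/sys/class/power_supply/{supply}"
    with open(f"{base}/voltage_now") as f:
        voltage_uv = int(f.read())
    with open(f"{base}/current_now") as f:
        current_ua = int(f.read())
    return battery_power_watts(voltage_uv, current_ua)
```

As noted above, this only sees current flowing into or out of the battery, so it can’t measure total system draw while the charger is plugged in.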

  • serenissi@lemmy.world · 3 days ago

    I’ve seen 10-12 W easily on 4K for SoCs without AV1 decode. Your SoC (11th-gen Intel) should support AV1. Try playing the video in mpv (with yt-dlp integration) with various hardware acceleration options to see if it changes; your browser is probably software-decoding.

    Even on SoCs with hardware decoding support, I noticed 2-3 W of extra power usage when playing YouTube from the website compared to mpv or FreeTube. The website seems to be doing inefficient JS stuff, but I haven’t profiled it.
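To compare decode paths as suggested above, you can run the same URL through mpv with different --hwdec values (real mpv options; mpv hands web URLs to yt-dlp automatically when it’s installed). A small sketch:

```python
# Build mpv command lines to compare software vs. hardware decoding on the
# same video. --hwdec=no forces software decode, "auto-safe" lets mpv pick
# a known-good hardware path, and "vaapi" is the usual choice on Intel/AMD.
import subprocess

def mpv_cmd(url: str, hwdec: str) -> list[str]:
    return ["mpv", f"--hwdec={hwdec}", url]

def compare_decode_modes(url: str) -> None:
    for mode in ("no", "auto-safe", "vaapi"):
        subprocess.run(mpv_cmd(url, mode))  # watch power draw during each run
```

Run each mode while watching the power readout; a large gap between "no" and "vaapi" confirms hardware decode is working.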

    • HiddenLayer555@lemmy.ml · 3 days ago

      I wish there were more M.2 cards beyond just SSDs and wireless NICs. The idea of a small-form-factor PCIe interface is underutilized, and things like hardware codec accelerators could keep laptops with older processors usable with new standards for longer. It’s sad how PCMCIA had an entire ecosystem of expansion cards, yet we somehow decided that the much-higher-bandwidth M.2 is only for storage and networking. Hell, do what sound cards in the 90s/00s did and make M.2 SSDs specifically designed for upgrading older laptops, with built-in accelerators for the latest media standards. Hardware acceleration is energy-efficient and could probably be bundled into the flash controller the way it’s bundled into the processor, and unless you have a top-of-the-line SSD you’re probably not saturating the M.2 interface anyway.
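The “not saturating M.2” point checks out with rough numbers. Assuming a typical PCIe 3.0 x4 M.2 slot and an illustrative mid-range NVMe figure (not a specific model):

```python
# Back-of-envelope: usable bandwidth of a PCIe 3.0 x4 M.2 slot vs. a
# mid-range NVMe drive. PCIe 3.0 runs 8 GT/s per lane with 128b/130b
# line encoding.

def pcie3_bandwidth_gbs(lanes: int) -> float:
    """Approximate usable PCIe 3.0 bandwidth in GB/s."""
    return 8.0 * lanes * (128 / 130) / 8  # GT/s -> GB/s after encoding overhead

m2_slot = pcie3_bandwidth_gbs(4)   # ~3.94 GB/s for a typical x4 slot
midrange_ssd = 2.5                 # GB/s sequential read, illustrative figure
headroom = m2_slot - midrange_ssd  # ~1.4 GB/s left for other functions
```

That leftover bandwidth is the headroom a hypothetical combo SSD-plus-accelerator card would have to work with.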

      • ☂️-@lemmy.ml · 3 days ago

        Capitalism underutilizes tech, and it’s sad. We could be in 2085 already if we didn’t waste time and materials on shit made to be thrown away in a few years.

    • gnuhaut@lemmy.ml · 3 days ago

      There’s a browser extension called “Your Codecs.” that can prevent YouTube from serving you AV1-encoded videos.

          • ozymandias117@lemmy.world · 3 days ago

          AMD has been proving over the last few years that x86_64 can be at least as power-efficient as ARM (given a performance floor around phone/laptop level… I doubt it can get as low-power as a little ARM microcontroller).

          It seems like x86 was getting so power-hungry because of Intel’s laser focus on single-core performance.

  • LaLuzDelSol@lemmy.world · 2 days ago

    That is very impressive! Although, to be honest, I question the accuracy of all those estimated power draws. I would be interested to see an endurance test of your battery: assuming your battery capacity is accurate, your runtime on a full charge should line up with your power draw.
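That cross-check is simple arithmetic. Assuming the Framework’s battery is roughly 55 Wh (an assumption; check your model’s spec) and the 9-10 W draw reported above:

```python
# Runtime sanity check: hours on a full charge = capacity (Wh) / draw (W).
# 55 Wh is an assumed Framework 13 battery capacity, not a measured value.

def expected_runtime_h(capacity_wh: float, draw_w: float) -> float:
    return capacity_wh / draw_w

runtime = expected_runtime_h(55.0, 9.5)  # roughly 5.8 hours at ~9.5 W
```

If a full-charge playback run lands far from that figure, the software power estimate is suspect.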