Username from a Quake Champions bot.

  • 1 Post
  • 7 Comments
Joined 1 year ago
Cake day: June 7th, 2023

  • I don’t think you know what I’m getting at. I know about audio cards, as I’m an audiophile. I can tell you with confidence that DACs can only convert digital sound data into analogue, and that’s due to the audio jack being older than digital audio.

    The problem with your examples (GPUs, ASICs, and FPGAs) is that they’re digital devices. An analogue device isn’t compatible with a digital device, much like how digital sound data (songs, audio tracks in videos, system sounds, etc.) and analogue audio don’t technically work together. They only work because the quality of the sound gets downgraded during the jump to digital recording methods.

    If you look at many older albums, like “The Piper at the Gates of Dawn,” you may notice that they are offered at very high quality (24-bit/192 kHz is common). This is because they were recorded on audio tape, which could store incredibly high-resolution sound. It’s the same situation with film, and is the reason old movies can still be re-released at higher resolutions (assuming the film the movie was originally shot on is still around). Newer albums, however, often cap out at 24-bit/48 kHz, as digital sound requires the quality to be preconfigured. Analogue just records.

    When you’re listening to sound on a digital device, you’re always dealing with compression of some kind, even if the sound is “lossless” in the sense that the audio file was recorded at CD quality. This is because storing analogue data is impossible on digital storage devices. What’s actually stored is a lot like a smooth waveform that a bunch of pillars are trying to match. The pillars may get close, but they will never actually be the waveform due to their shape.

    Using an analogue device to accelerate something requires at least some information to be lost in translation, even if the file size is stupidly large. All in all, getting analogue data onto a digital device will always be lossy in some regard, and storing truly analogue data on digital storage is impossible.

    TL;DR: Analogue and digital are inherently different, and mixing the two is only possible through a lot of compromises.
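The “pillars” picture above is sampling plus quantization in a nutshell. Here is a minimal Python sketch of the idea (the 1 kHz tone, 48 kHz sample rate, and 8-bit depth are just illustrative numbers, not anything from the comment):

```python
import math

def quantize(x, bits):
    """Snap a sample in [-1.0, 1.0] to the nearest of 2**bits discrete levels."""
    levels = 2 ** bits - 1
    return round((x + 1.0) / 2.0 * levels) / levels * 2.0 - 1.0

# Sample one cycle of a 1 kHz sine at 48 kHz, then quantize each sample to 8 bits.
sample_rate = 48_000
freq = 1_000
samples = [math.sin(2 * math.pi * freq * n / sample_rate) for n in range(48)]
pillars = [quantize(s, 8) for s in samples]

# The pillars track the smooth curve but never exactly equal it:
# the worst-case rounding error is half a quantization step.
step = 2.0 / (2 ** 8 - 1)
max_error = max(abs(a - b) for a, b in zip(samples, pillars))
print(max_error <= step / 2 + 1e-12)  # True
```

Raising the bit depth shrinks the steps and raising the sample rate packs the pillars closer together, which is why the quality has to be chosen up front, but the error never reaches exactly zero.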




  • No problem! I’m sorry if I came off as hostile towards analogue machines here. I actually think they’re cool, just not in the way people think they are (“unraveling Moore’s law” is a bit far-fetched, Microsoft.)

    Oh, and some advice for anyone who isn’t too well-versed in technology: The tech industry isn’t known for honesty. For them, hype comes before everything, even profitability. Take any “revolutionary” or “innovative” new piece of tech with a grain of salt, especially now that tech companies are getting a bit goofy with their promises due to investors realizing that unprofitable companies aren’t sustainable.

    EDIT: The two deleted comments are duplicates of this one that were posted due to a bug in Jerboa. Sorry for any confusion!




    I wouldn’t say I’m smarter than you; rather, I just know some stuff about how computer components work. But what you’re looking at is the latter.

    The problem with trying to move to another type of computer is that modern software is designed solely for digital machines. Considering what’s been stated above, how do you port these programs to another type of computer?

    The answer is that you don’t. Porting to a different CPU architecture can already take some time for most programs, but a port to a fundamentally different type of computer would take an incredibly long time.

    That is, if you can even port anything. Since digital and analogue computers are completely different, you’d instead have to build functional clones by referencing the source code. If you don’t have the source, you’re outta luck.

    TL;DR: We’ve over-invested in digital computers, and there’s no going back.