• RvTV95XBeo@sh.itjust.works

    Maybe I’m just getting old, but I honestly can’t think of any practical use case for AI in my day-to-day routine.

    ML algorithms are just fancy statistics machines, and to that end, I can see plenty of research and industry applications where large datasets need to be assessed (weather, medicine, …) with human oversight.

    But for me in my day to day?

    I don’t need a statistics bot making decisions for me at work, because if it was that easy I wouldn’t be getting paid to do it.

    I don’t need a giant calculator telling me when to eat or sleep or what game to play.

    I don’t need a Roomba with a graphics card automatically replying to my text messages.

    Handing over my entire life’s data just so an ML algorithm might be able to tell me what that one website I visited 3 years ago that sold kangaroo testicles was isn’t a filing system. There’s nothing I care about losing enough to go to the effort of setting up copilot, but not enough to just, you know, bookmark it, or save it with a clear enough file name.

    Long rant, but really, what does copilot actually do for me?

    • sem@lemmy.blahaj.zone

      Before ChatGPT was invented, everyone kind of liked how you could type “bird” into Google Photos and it would show you some of your photos that had birds.

    • wetbeardhairs@lemmy.dbzer0.com

      They’re great for document management. You can have one build indices locally on your machine, with no internet connection, and then ask for things in human terms when you want to find them. I’ve got a few GB of documents and finding things is a bitch - I’m actually waiting on the miniforums a1 pro whatever-the-fuck to be released with an option to buy it without Windows (because fuck M$) to do exactly this for our home documents.
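
      For what it’s worth, the local indexing part doesn’t need a giant model; a rough sketch of offline semantic search with sentence-transformers (the model name and documents below are placeholders, not what anyone in this thread is actually running):

          # Rough sketch: offline semantic search over local documents.
          # Requires `pip install sentence-transformers`; everything runs locally
          # once the model has been downloaded. Documents and model are placeholders.
          from sentence_transformers import SentenceTransformer, util

          docs = [
              "2023 tax return, joint filing, submitted in April.",
              "Warranty and invoice for the washing machine.",
              "Scan of the rental contract for the old apartment.",
          ]

          model = SentenceTransformer("all-MiniLM-L6-v2")  # small, CPU-friendly model
          doc_embeddings = model.encode(docs, convert_to_tensor=True)

          query = "where is the paperwork for the washer?"
          query_embedding = model.encode(query, convert_to_tensor=True)

          # Cosine similarity between the query and every document; report the best hit.
          scores = util.cos_sim(query_embedding, doc_embeddings)[0]
          best = int(scores.argmax())
          print(f"best match ({float(scores[best]):.2f}): {docs[best]}")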

      • self@awful.systems

        a local search engine but shitty, stochastic, and needs way too much compute for “a few gb of documents”, got it, thanks for chiming in

      • RvTV95XBeo@sh.itjust.works

        Offline indexing has been working just fine for me for years. I don’t think I’ve ever needed to search for something esoteric like “the report with the blue header and the photo of 3 goats having an orgy”. If I really can’t remember the file name, or what it’s associated with in my filing system, I can still remember some keywords from the text.

        Better indexing / automatic tagging of my photos could be nice, but that’s a rare occurrence, not an “I NEED a button for this POS on my keyboard and also want it always listening to everything I do” kind of situation

        • self@awful.systems

          I wish that offline indexing and archiving were normalized and more accessible, because it’s a fucking amazing thing to have

    • Flipper@feddit.org

      Apparently it’s useful for extracting information out of a text into a format you specify. A friend is using it to extract transactions out of 500-year-old texts. However, to get rid of hallucinations the temperature needs to be 0, so the only way is to self-host.
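
      As a rough sketch of what that setup can look like, assuming a self-hosted model exposed through an OpenAI-compatible endpoint (the URL, model name and sample text below are placeholders):

          # Rough sketch: structured extraction at temperature 0 against a
          # self-hosted, OpenAI-compatible endpoint (llama.cpp server, vLLM, etc.).
          # The base_url, model name and sample text are placeholders.
          from openai import OpenAI

          client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

          source_text = "Item: three barrels of ale sold to the miller for 12 shillings, anno 1523."

          response = client.chat.completions.create(
              model="local-model",
              temperature=0,  # deterministic decoding; reduces but does not eliminate hallucination
              messages=[
                  {"role": "system", "content": "Extract transactions as JSON with keys: item, buyer, price, year. Output JSON only."},
                  {"role": "user", "content": source_text},
              ],
          )
          print(response.choices[0].message.content)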

      • daellat@lemmy.world

        Well, LLMs are capable (but prone to hallucination) and cost an absolute fuckton of energy. There have been purpose-trained, efficient ML models that we’ve used for years. Document Understanding and Computer Vision are great, just don’t use an LLM for them.

      • OhNoMoreLemmy@lemmy.ml

        Setting the temperature to 0 doesn’t get rid of hallucinations.

        It might slightly increase accuracy, but it’s still going to go wrong.

    • Ledericas@lemm.ee

      same here, i mostly don’t even use it on the phone. my bro is into it though, thinking AI-generated pictures are good.

      • RvTV95XBeo@sh.itjust.works

        It’s a fun party trick for like a second, but at no point today did I need a picture of a goat in a sweater smoking three cigarettes while playing tic-tac-toe with a llama dressed as the Dalai Lama.

        • bampop@lemmy.world

          It’s great if you want to do a kids’ party invitation or something like that

          • meowMix2525@lemm.ee

            That wasn’t that hard to do in the first place, and certainly isn’t worth the drinking water to cool whatever computer made that calculation for you.

    • ByteJunk@lemmy.world

      I use it to speed up my work.

      For example, I can give it a database schema and describe what I need to achieve, and most of the time it will throw out a pretty good approximation, or even get the query right on the first go, depending on complexity and how well I phrase the request. I could write these myself, of course, but not in 2 seconds.

      Same with text formatting, for example. I regularly need to format long strings in specific ways, adding brackets and changing upper/lower capitalization. It does it in a second, and really well.

      Then there are just convenience things. At what date and time will something end if it starts in two weeks and takes 400h to do? There are tools for that, or I could figure it out myself, but I mean the AI is just there and does it in a sec…
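
      (For comparison, that last calculation as a few lines of Python; “two weeks from now” here is simply whenever you run it:)

          # Rough sketch: when does a 400-hour task end if it starts two weeks from now?
          from datetime import datetime, timedelta

          start = datetime.now() + timedelta(weeks=2)
          end = start + timedelta(hours=400)
          print(f"starts: {start:%Y-%m-%d %H:%M}")
          print(f"ends:   {end:%Y-%m-%d %H:%M}")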

      • self@awful.systems

        it’s really embarrassing when the promptfans come here to brag about how they’re using the technology that’s burning the earth and it’s just basic editor shit they never learned. and then you watch these fuckers “work” and it’s miserably slow cause they’re prompting the piece of shit model in English, waiting for the cloud service to burn enough methane to generate a response, correcting the output and re-prompting, all to do the same task that’s just a fucking key combo.

        Same with text formatting, for example. I regularly need to format long strings in specific ways, adding brackets and changing upper/lower capitalization. It does it in a second, and really well.

        how in fuck do you work with strings and have this shit not be muscle memory or an editor macro? oh yeah, by giving the fuck up.
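
        for reference, the whole “brackets and capitalization” task as a throwaway script (input strings made up):

            # Rough sketch: wrap strings in brackets and upper-case them.
            # The input strings are made up; a sed one-liner or an editor macro
            # does the same job.
            strings = ["order id 7431", "customer ref ab-19", "invoice 2024-03"]
            formatted = [f"[{s.upper()}]" for s in strings]
            print("\n".join(formatted))  # [ORDER ID 7431] ...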

        • CarrotsHaveEars@lemmy.ml

          (100% natural rant)

          I can change a whole fucking sentence to FUCKING UPPERCASE by just pressing vf.gU in fucking vim, using a fraction of the energy it takes to run a fucking marathon, which in turn is only a fraction of the energy the fucking AI cloud cluster uses to spit out the same shit. The comparison is like a ping pong ball to the Earth, then to the fucking sun!

          Alright, bros, listen up. All these great tasks you claim AI does faster and better - I can write up a script or something to do them even faster and better. Fucking A! That high you feel when you use AI comes from you not knowing how to do the thing yourself, or whether it’s even possible. You!

          You prompt bros are blasting shit tons of energy just to achieve the same quality of work, if not worse, and taking much fucking longer to do it.

          And somehow these executives claim AI improves fucking productivity‽

          • Hexarei@programming.dev

            The only things I’ve seen it do better than I could manage with a script or in Vim are things that require natural language comprehension. Like, “here’s an email forwarded to an app, find anything that sounds like a deadline” or “given this job description, come up with a reasonable title summary for the page it shows up on”… But even then, those are small things that could be entirely omitted from an app’s functionality without any trouble for the user. And then there are the hallucinations, and it being super wrong sometimes.

            The whole thing is a mess

          • self@awful.systems

            exactly. in Doom Emacs (and an appropriately configured vim), you can surround the word under the cursor with brackets with ysiw] where the last character is the bracket you want. it’s incredibly fast (especially combined with motion commands, you can do these faster than you can think) and very easy to learn, if you know vim.

            and I think that last bit is where the educational branch of our industry massively fucked up. a good editor that works exactly how you like (and I like the vim command language for realtime control and lisp for configuration) is like an electrician’s screwdriver or another semi-specialized tool. there’s a million things you can do with it, but we don’t teach any of them to programmers. there’s no vim or emacs class, and I’ve seen the quality of your average bootcamp’s vscode material. your average programmer bounces between fad editors depending on what’s being marketed at the time, and right now LLMs are it. learning to use your tools is considered a snobby elitist thing, but it really shouldn’t be — I’d gladly trade all of my freshman CS classes for a couple semesters learning how to make vim and emacs sing and dance.

            and now we’re trapped in this industry where our professionals never learned to use a screwdriver properly, so instead they bring their nephew to test for live voltage by licking the wires. and when you tell them to stop electrocuting their nephew and get the fuck out of your house, they get this faraway look in their eyes and start mumbling about how you’re just jealous that their nephew is going to become god first, because of course it’s also a weirdo cult underneath it all, that’s what happens when you vilify the concept of knowing fuck all about anything.

      • Samskara@sh.itjust.works

        adding brackets and changing upper/lower capitalization

        I have used a system-wide service in macOS for that for decades now.

      • V0ldek@awful.systems

        changing upper/lower capitalization

        That’s literally a built-in VSCode command, my dude. It does it in milliseconds and doesn’t require switching a window or even a conscious thought from you

      • morbidcactus@lemmy.ca

        Gotta be real, LLMs for queries make me uneasy. We’re already in a place where data modeling isn’t as common and people don’t put indexes or relationships between tables (and some tools didn’t really support those either). They might be alright at describing tables (Databricks has it baked in, for better or worse; it’s usually pretty good at a quick summary of what a table is for), but throwing an LLM on top of that doesn’t really inspire confidence.

        If your data model is highly normalised, with FKs everywhere, good naming and good documentation, then yeah, totally, I could see that helping - but if that’s the case you already have good governance practices (which all ML tools benefit from, AFAIK). Without that, I’m dreading the queries; people are already perfectly capable of generating stuff that gives DBAs a headache. Simple cases, yeah, maybe, but complex queries? idk, I’m not sold.

        Data understanding is part of the job anyhow, and that’s largely conceptual, which maybe LLMs could work as an extension for, but I really wouldn’t trust one to generate full-on queries in most of the environments I’ve seen. Data is overwhelmingly messy and orgs don’t love putting effort towards governance.

        • jacksilver@lemmy.world

          I’ve done some work on natural language to SQL, both with older models (like BERT) and current LLMs. It can do alright if there is a good schema and reasonable column names, but otherwise it can break down pretty quickly.

          That’s before you get into the fact that SQL dialects are a really big issue for LLMs to begin with. They all look so similar that I’ve found it common for models to switch between dialects without warning.
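
          For anyone wondering what “a good schema” buys you here: the schema typically goes straight into the prompt, roughly like this sketch (the table and question are made up):

              # Rough sketch: schema-in-the-prompt setup for natural-language-to-SQL.
              # The schema and question are made up; better table/column names tend to
              # produce better SQL, and naming the dialect helps keep it from drifting.
              schema = """
              CREATE TABLE orders (
                  order_id    INTEGER PRIMARY KEY,
                  customer_id INTEGER REFERENCES customers(customer_id),
                  ordered_at  TIMESTAMP,
                  total_cents INTEGER
              );
              """

              question = "Total revenue per customer in 2024, highest first."

              prompt = (
                  "You write SQLite queries.\n"  # pin the dialect explicitly
                  f"Schema:\n{schema}\n"
                  f"Question: {question}\n"
                  "Return a single SQL query and nothing else."
              )
              print(prompt)  # this is what gets sent to whichever model is in use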

          • morbidcactus@lemmy.ca

            Yeah, I can totally understand that. Genie is Databricks’ one, and apparently it’s surprisingly decent at that, but it has access to a governance platform that traces column lineage on top of whatever descriptions and other metadata you give it. I was pretty surprised by the accuracy of some of its auto-generated descriptions, though.

            • jacksilver@lemmy.world

              Yeah, the more data you have around the database the better, but that’s always been the issue with data governance - you need to stay on top of that or things start to degrade quickly.

              When the governance is good, the LLM may be able to keep up, but will you know when things start to slip?

      • Hudell@lemmy.dbzer0.com

        I use it to parse log files, compare logs from successful and failed requests and that sort of stuff.
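
        (A non-LLM baseline for the “compare logs” part is just a diff; a rough sketch with Python’s difflib, file names made up:)

            # Rough sketch: diff a failed request's log against a successful one.
            # File names are made up; this is the non-LLM baseline for that comparison.
            import difflib

            with open("request_ok.log") as ok, open("request_failed.log") as bad:
                diff = difflib.unified_diff(
                    ok.read().splitlines(),
                    bad.read().splitlines(),
                    fromfile="request_ok.log",
                    tofile="request_failed.log",
                    lineterm="",
                )
                print("\n".join(diff))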

      • sem@lemmy.blahaj.zone

        The first two examples I really like, since you’re able to verify them easily before using them, but for the math one, how do you know it gave you the right answer?

    • Don_alForno@feddit.org

      Our boss all but ordered us to have IT set this shit up on our PCs. So far I’ve been stalling, but I don’t know how long I can keep doing it.

    • AbsentBird@lemm.ee

      The only feature that actually seems useful for on-device AI is voice to text that doesn’t need an Internet connection.

      • RvTV95XBeo@sh.itjust.works

        As someone who hates orally dictating my thoughts, that’s a no from me dawg, but I can kinda understand the appeal (though I’ll note offline TTS has been around for like a decade pre-AI)

        • froztbyte@awful.systems

          longer: dragon dictate and similar go back to the mid 90s (and I bet the research goes back slightly earlier, not gonna check now)

          similar for TTS

      • zurohki@aussie.zone

        I tried feeding Japanese audio to an LLM to generate English subs and it started translating silence and music as requests to donate to anime fansubbers.

        No, really. Fansubbed anime would put their donation message over the intro music or when there wasn’t any speech to sub and the LLM learned that.
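
        (For context, that pipeline is usually something like openai-whisper’s translate task; the commenter doesn’t say what they actually used, and the model size and file name below are assumptions.)

            # Rough sketch: Japanese audio -> English text with openai-whisper.
            # Model size and file name are placeholders; the commenter doesn't say
            # which tool they actually used.
            import whisper

            model = whisper.load_model("medium")
            result = model.transcribe("episode01.mka", language="ja", task="translate")
            for seg in result["segments"]:
                print(f"[{seg['start']:7.2f} -> {seg['end']:7.2f}] {seg['text']}")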

      • Dragonstaff@leminal.space

        We’ve had speech to text since the 90s. Current iterations have improved, like most technology has improved since the 90s. But, no, I wouldn’t buy a new computer with glaring privacy concerns for real time subtitles in movies.

      • Bytemeister@lemmy.world

        You’re thinking too small. AI could automatically dub the entire movie, mimicking the actors’ voices while simultaneously moving their lips and mouths to form the words correctly.

        It would just take your daily home power usage to do a single 2hr movie.

      • turtlesareneat@discuss.online

        Depends on the implementation.

        Just about everyone I know loves how iPhones can take a picture and readily identify a plant or animal. That’s actually delightful. Some AI tech is great.

        Now put an LLM chatbox where people expect a search box, and see what happens… yeah that shit sucks.

      • Hudell@lemmy.dbzer0.com

        Whenever I ask random people who are not in IT, they either don’t know about it or they love it.

        • RedditIsDeddit@lemmy.world

          I work in IT and have recently been having a lot of fun leveraging AI in my home lab to program things, as well as doing audio/video generation (which is a blast, honestly). So… I mean, I think it really depends on how it’s integrated and used.

          • froztbyte@awful.systems

            “I work in IT” says the rando, rapaciously switching between support tickets in their web browser and their shadow-IT personal browser

            “I’ve been having a lot of fun” continues the rando, in a picture-perfect replica of every other fucking promptfan posting the same selfish egoist bullshit

            “So… I mean, I think it really depends on how it’s integrated and used” says thee fuckwit, who can’t think two words beyond their own fucking nose

            • RedditIsDeddit@lemmy.world

              ““I work in IT” says the rando, rapaciously switching between support tickets in their web browser and their shadow-IT personal browser” says the ignoramus that hasn’t left his house in weeks and trolls people for fun.

              ““I’ve been having a lot of fun” continues the rando, in a picture-perfect replica of every other fucking promptfan posting the same selfish egoist bullshit” Says the moron with no context that is making assumptions about someone with very little information and apparently no worldly knowledge.

              ““So… I mean, I think it really depends on how it’s integrated and used” says thee fuckwit, who can’t think two words beyond their own fucking nose” Says the elitist that literally contributes nothing positive to the world.

              Go touch some grass

              • froztbyte@awful.systems

                look, I’ll do you the disfavour of giving you an actually detailed reply

                you know exactly fucking nothing about me, about what I do, and about my competencies. if you did just a liiiiiiittle bit of work you might get an inkling, but: I know you didn’t, and I know you don’t.

                that’s not a judgement, that’s just fact.

                trying to flippantly rage at my derision of your shitty post… I mean, points for effort? but… be more interesting…? you’re factory-line-identical outrage, and it’s boring

                para (2): sure, I made some inferred guesses. still don’t think I’m wrong (and your little tagline ragefest there isn’t helping, either)

                paras (1) and (3): I lul. once again, if you knew anything about me…

                but sure, go off queen. I’m sure your emanated bilge will be received with vim and verve.

  • TheThrillOfTime@lemmy.ml

    AI is going to be this era’s Betamax, HD-DVD, or 3D TV glasses. It doesn’t do what was promised and nobody gives a shit.

    • snooggums@lemmy.world

      Betamax had better image and sound but was limited by running time, and then VHS doubled down with even lower quality to increase how many hours would fit on a tape. VHS was simply more convenient without being that much lower quality at normal tape lengths.

      HD-DVD was comparable to Blu-ray and just happened to lose out because the industry wouldn’t allow two similar technologies to exist at the same time.

      Neither failed to do what they promised. They were both perfectly fine technologies that lost in a competition that only allows a single winner.

      • GenosseFlosse@feddit.org

        AFAIK Betamax did not have any porn content, which might have contributed to the sale of VHS systems.

      • xkbx@startrek.website

        Blu-ray was slightly better, if I recall correctly. With the rise of higher-definition televisions, people wanted to max out the quality possible, even if most people (still) can’t tell the difference.

        • BeNotAfraid@lemmy.world

          Not just that, space. Blu-rays have way more space than DVDs. Remember how many 360 games came with multiple discs? Not a single PS3 game did, unless it was a bonus behind-the-scenes type thing.

          • Rose@slrpnk.net

            The Xbox 360 used DVDs for game discs and could play video DVDs. It “supported” HD-DVDs - you needed an add-on which had a separate optical drive in it. Unsurprisingly, this didn’t sell well.

        • bus_factor@lemmy.world

          That’s not why it won, though. It won because the industry wanted zone restrictions, which only Blu-ray supported. They suck for the user, but they allow the industry to stagger releases in different markets. In reality it just means that I can’t get discs of most foreign films, because they won’t work in my player.

          • Revan343@lemmy.ca

            I’m sure that was a factor, but Blu-ray won because the most popular Blu-ray player practically sold itself

            • bus_factor@lemmy.world

              It’s hard to say what the final nail in the coffin was, but it is true that Blu-ray went from underdog to outselling HD-DVD around the time the PlayStation 3 came out. I’m not sure how much those early sales numbers matter, though, because I’m sure both were still minuscule compared to DVD.

              When 20th Century Fox dropped support for HD-DVD, they cited superior copy protection as the reason. Lionsgate expressed a similar sentiment.

              When Warner later announced they were dropping HD-DVD, they did cite customer adoption as the reason for their choice, but they also did it right before CES, so I’m pretty sure there were some backroom deals at play as well.

              I think the biggest impact of the PlayStation 3 was accelerating adoption of Blu-Ray over DVD. Back when DVD came out, VHS remained a major player for years, until the year there was a DVD player so dirt cheap that everyone who didn’t already have a player got one for Christmas.

          • Gerudo@lemm.ee

            The big plus for HD-DVD was that it was far cheaper to produce; it didn’t need massive retooling for manufacturing.

    • RedSnt 👓♂️🖥️@feddit.dk

      I was just about to mention porn and how each new format of the past came down to that very same factor.
      If AI computers were incredible at making AI porn I bet you they’d be selling a lot better haha

    • blarth@thelemmy.club

      No, I’m sorry. It is very useful and isn’t going away. This thread is either full of Luddites or disingenuous people.

      • self@awful.systems

        nobody asked you to post in this thread. you came and posted this shit in here because the thread is very popular, because lots and lots of people correctly fucking hate generative AI

        so I guess please enjoy being the only “non-disingenuous” bootlicker you know outside of work, where everyone’s required (under implicit threat to their livelihood) to love this shitty fucking technology

        but most of all: don’t fucking come back, none of us Luddites need your mid ass

    • froztbyte@awful.systems

      “coprocessors, but matrix-math specific”

      the various *PUs are “things that help a lot of ML models run faster” sidecar chipset designs

      it’s actually kinda hard to get concrete details, afaict. I’ve had a bit of a look around for silicon teardowns and shit, and haven’t really found any good ones yet

      • ssillyssadass@lemmy.world

        I bet they’re not even special, just normal computers with some low-power AI software and a chunky price tag to match the hardware they totally sport.

        • froztbyte@awful.systems

          nah, there’s definitely an actual something there - there’s concrete actual physical extra silicon, with a design and a (marketing?) purpose

  • yarr@feddit.nl

    These “AI Computers” are a solution looking for a problem. The marketing people naming these “AI” computers think that AI is just some magic fairy dust term you can add to a product and it will increase demand.

    What are the “killer features” of these new laptops, and what % price increase are they worth?

  • besselj@lemmy.ca

    Can the NPU be used for practical purposes other than generative AI? If not, I don’t need it.

    • Barbecue Cowboy@lemmy.dbzer0.com

      It depends on how broad you are with the definition. There are a lot of uses for an NPU that aren’t hosting your own ChatGPT analog.

        • Barbecue Cowboy@lemmy.dbzer0.com

          You are right that they are effectively limited-use GPUs, but that doesn’t make them useless. That’s really the point, though I’d agree it is a weird fit for laptops.

          My personal interest is object detection / facial recognition, which is a function a lot of AIs have, but you may or may not classify it as ‘Generative AI’ on its own.

          • self@awful.systems

            you may or may not classify it as ‘Generative AI’ on its own.

            while the ship has sailed on calling the opencv shit you’re doing AI (thx, grifters of the first AI bubble), which part of object detection and facial recognition is generative?

  • merdaverse@lemmy.world

    What is even the point of an AI coprocessor for an end user (excluding ML devs)? Most of the AI features run in the cloud, and even if they could run locally, companies are very happy to charge you rent for services and keep you vendor locked in.

    • dreugeworst@lemmy.ml

      afaict they’re computers with a GPU that has some hardware dedicated to the kind of matrix multiplication common in inference in current neural networks. pure marketing BS because most GPUs come with that these days, and some will still not be powerful enough to be useful

      • blarth@thelemmy.club

        This comment is the most important one in this thread. Laptops already had GPUs. Does the copilot button actually result in you conversing with an LLM locally, or is inference done in the cloud? If the latter, it’s even more useless.

      • Gutek8134@lemmy.world

        IDK if the double pun was intended, but FLOPS is a measure of how many floating-point operations a computer can perform per second

  • yesman@lemmy.world

    Even non-tech people I talk to know AI is bad because the companies are pushing it so hard. They intuit that if the product were good, they wouldn’t be giving it away, much less begging you to use it.

    • jonhendry@awful.systems

      It’s partly that and partly a mad dash for market share in case they get it to work usefully. Although this is kind of pointless, because AI isn’t very sticky. There’s not much to keep you from using another company’s AI service. And only the early-adopter nerds are figuring out how to run it on their own hardware.

    • lev@slrpnk.net

      You’re right - and even if the user is not conscious of this observation, many are subconsciously behaving in accordance with it. Having AI shoved into everything is off-putting.

      • k0e3@lemmy.ca

        Speaking of off-putting, that friggin copilot logo floating around on my Word document is so annoying. And the menu that pops up when I paste text — wtf does “paste with Copilot” even mean?

        • Rekorse@sh.itjust.works

          They are trying to saturate the user base with the word Copilot. At least Microsoft isn’t very sneaky about anything.

    • Ledericas@lemm.ee

      customers don’t want AI; only the corporation heads seem obsessed with it.

  • normalexit@lemmy.world

    If I want AI I have a multitude of options. It’s baked into my editors and easily available on the web. I just paste some crap into a text box and we’re off to the races.

    I don’t want it in my OS. I don’t want it embedded in my phone. I’ll keep disabling it as long as that is an option.

  • TommySoda@lemmy.world

    I don’t even want Windows 11, specifically because of AI. It’s intrusive, unnecessary, and the average person has no use for it. The only time I have used AI for anything productive was when I needed to ask some very obscure questions about Linux, since I’m trying to get rid of Windows entirely.

    • morbidcactus@lemmy.ca

      Seriously missed an opportunity to bring that back as their agent.

      Legitimately though, Cortana was pretty great. There was a feature to help plan commutes (before I went more or less full remote); all it really did was watch traffic and adjust a suggested time to depart, but it was pretty nice.

      I say it every time someone mentions WP7/8/10: those Lumia devices were fantastic and I totally miss mine. The 1020 had a fantastic camera on it, especially for the early 2010s

      • lohky@lemmy.world

        I loved my Lumia. I have the Windows Phone launcher on my phone currently haha

    • filcuk@lemmy.zip

      Bad news for people who use Google: they’ve removed the same feature, so their assistant is more useless than Cortana was a decade ago (only a mild exaggeration)

    • aio@awful.systems

      As technology advanced, humans grew accustomed to relying on the machines.

        • gen/Eric Computers@lemmy.zip

          The fuck? How are you so completely oblivious to the joke in my comment?

          It’s almost like you failed to read the comment I replied to and/or failed to comprehend its meaning towards the original post.

          Maybe out of context, my comment was “weird,” but with context, it’s just a joke that went so far over your head that I’m surprised you even have the brain power to type a reply.

      • blakestacey@awful.systems

        Comment removed for being weird (derogatory). I refrained just barely from hitting the “ban from community” button on the slim chance it was a badly misfired joke from a person who can otherwise behave themself, but I won’t object if any other mod goes ahead with the banhammer.

        • froztbyte@awful.systems

          I was vacillating on reply harshness, but “hey tell me when you’re home” … some people just need to get better with informed consent.

          (agree that this probably wasn’t banworthy (yet), but close)

          • self@awful.systems

            it was some shitty follow-up to your joke so unfunny it made your post less funny just by being under it. pull this thread from mastodon and chances are you’ll see it if you really want to

            anyway if you want a laugh, they threw a tantrum and reported your post because we deleted theirs:

            their joke was in that exact tone too because they’re a comedy black hole

        • gen/Eric Computers@lemmy.zip

          Did none of y’all read the comment we’re all replying to?

          Like, did you just completely miss the joke? It went so far over your head that I’m not sure you even understand what the definition of a joke is.

          Yes, out of context, my comment did sound “weird,” but if you stopped to read the comment I’m replying to and could comprehend its meaning towards the original post, then maybe you could see the joke and kinda see where I was going with it.

          I guess people like you are why this instance is called “awful systems.”

    • gen/Eric Computers@lemmy.zip

      [Insert “joke” pretending to be Cortana or another “assistant,” but I’m actually a human and I’m going to remind the user to do the “weird” action he requested, which is apparently completely OK to post, but I’ll get my comment removed and threatened with a ban here]

      • self@awful.systems

        it’s really weird that this turned into a tantrum where you tried to report other users for their jokes???