randycupertino 17 hours ago

> "Nvidia isn't alone, as tech giants have taken measures to push employees to incorporate more AI into their day-to-day work. Both Microsoft and Meta plan to evaluate employees based on their AI usage, and Google told engineers to use AI for coding, Business Insider reported. Amazon was in talks to adopt the AI coding assistant Cursor after employees requested it…"

My company also told us they are tracking how much we use AI, and that usage will be factored into our yearly performance reviews.

  • francisofascii 14 hours ago

    It is interesting because plenty of organizations ban the use of AI in many situations. One client I work for blocks Copilot in VSCode when on their VPN.

jqpabc123 16 hours ago

Anyone with half a brain knows that AI is unreliable. It can and will make mistakes.

Who gets the blame for this?

It's like insisting that managers hire unreliable people because they're cheaper, but the managers know they will pay the price for doing so.

  • zcw100 14 hours ago

    Ha! When was the last time anyone took the blame for crappy code? This is an industry with zero accountability for quality. Fail fast, right? At least when I tell an LLM it's wrong, it says "You're absolutely right" and gets to fixing it, rather than giving me an hour-long lecture about why they're totally correct and justified by their version of "best practices".

    • jqpabc123 11 hours ago

      LLMs are being used for a lot more than code generation.

  • thegrim33 10 hours ago

    Well, the point is the manager gets the praise/promotion/etc. for reducing costs and supposedly improving performance, and then they bounce and leave the company, moving on to the next place before the long-term effects can be evaluated.

  • lux-lux-lux 13 hours ago

    > Who gets the blame for this?

    The computer, which cannot be held accountable. See how that works?

cratermoon 16 hours ago

If AI were actually any good, programmers would have to sneak it in the back door, without the knowledge of management and their "approved software", like we do with almost any tool we find that really works.

windows2020 9 hours ago

Makes sense that an 'AI' chip maker would say that.

64718283661 13 hours ago

I would never install these AI tools on my computer. They're going to immediately scan and upload my source code. Why would I want them to steal my code? Nothing good can come from that.

cjbenedikt 15 hours ago

Hmmm...he almost comes across as desperate. Wonder why...

stalfosknight 17 hours ago

Why must executives be so fucking obnoxious about shoving AI into every possible orifice?

  • davesque 14 hours ago

    Because there's a lot of money in it.

  • chillingeffect 17 hours ago

    > And if AI does not work for a specific task, "use it until it does," he added.