Just curious, since it seems everyone mentions AI in some form in the majority of what I see posted here (maybe just the Reddit “algorithm” at work?). It’s almost like *not* doing AI is enough of a differentiator to stand out in the market at this point. Is there some crossover between Reddit and AI specifically, or is it a bigger trend where everyone is hopping on the AI bandwagon? It feels like “AI” is a rebrand of the “machine learning”/“ML” buzzwords of the recent past: a lot of fudging the edges of the actual definition, in what I think is an attempt to seem attractive in the funding market. Is anyone concerned about that backfiring?

  • SnooObjections6359@alien.topB
    1 year ago

    This is a very convoluted subject. In general, before ChatGPT, investors would decide whether a company was ‘AI’ based on how deep its proprietary models were. In other words, deep learning vs. traditional machine learning was the defining factor. You could argue this isn’t AI in its true form, but by that standard, NO COMPANY that I’ve come across is doing ‘true AI.’ Apps like ChatGPT, which are built on large transformer-based language models, are not AI by that definition either. OpenAI didn’t invent anything fundamentally new; they trained on the largest dataset the world has ever seen, but using pre-existing technology. So, posing a question back to the OP: what does a company have to do to be considered legitimate AI, in your opinion? Until we have a universally accepted answer to that question, I don’t see it backfiring on anyone unless you’re just a GPT wrapper or an API aggregator.