HACKER Q&A
📣 randomgermanguy

Anyone else disillusioned with "AI experts" in their team?


We had an internal workshop led by our internal AI team (mostly just LLMs), and had the horrible realisation that no one in that team actually knows what the term "AI" even means, or how a language model works.

One senior dev (also the team lead) tried to explain to me that AI is a subfield of machine learning and is always stochastic in nature (since ChatGPT responds differently to the same prompt).

We/they are selling tailor-made "AI products" to other businesses, but apparently we don't know how sampling works...? Also, no one could tell me where exactly our "self-hosted" models even run (turns out about 50% of the time it's just OpenAI/Anthropic), or which OCR model our product uses.

Am I just too junior/naive to get this or am I cooked?


  👤 nis0s Accepted Answer ✓
Maybe this is a joke, there’s a lot of talent out there. But if you’re not kidding, start looking for a job somewhere else on the dl, this ship isn’t fit to steer.

👤 zippyman55
I learned a huge amount from my team members, and they were usually smarter than I was. So, sure, I occasionally corrected them and impressed them, but all the bread crumbs they threw out, I gobbled up, and I learned a ton. And they never said stupid stuff! (OK, maybe about social events, but they were geeks.) When team members spout off erroneous stuff, and it's not just "occasionally," you have to question what you are going to learn from your co-workers. Your whole team may be inferior and contribute little to your learning and growth. This extends to management and the decisions they make regarding AI.

👤 illwrks
I work in-house and sat through a similar AI agency day within the last few months.

I came to the same conclusion: lots of experts, not much expertise.

I think my wider team is on par with them in ability and understanding, so we can now sift through the BS a bit more easily.

Nod, smile, accept that no one has a clear understanding.


👤 incomingpain
>We had an internal workshop led by our internal AI team (mostly just LLMs), and had the horrible realisation that no one in that team actually knows what the term "AI" even means, or how a language model works.

I'm the AI expert for my org. Everyone else is more or less opposed to AI.

>One senior dev (also the team lead) tried to explain to me that AI is a subfield of machine learning and is always stochastic in nature (since ChatGPT responds differently to the same prompt).

Machine learning is a subfield of AI, not the other way around.

Not really stochastic as far as I know. The whole random-seed and temperature thing is a bit of a grey area in my understanding, let alone top-k, top-p, etc. I often just accept what's recommended by the model folks.
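For what it's worth, those knobs are mechanical rather than mysterious: temperature rescales the model's raw scores before the softmax, and top-k truncates the distribution before the draw. A minimal sketch in plain Python (the logits here are made-up numbers, not from any real model):

```python
import math
import random

def sample_token(logits, temperature=1.0, top_k=None, seed=None):
    """Sample a token index from raw logits (hypothetical scores).

    temperature scales the logits before softmax: low values sharpen
    the distribution, high values flatten it. top_k keeps only the k
    most likely tokens. A fixed seed makes the draw reproducible --
    the randomness lives here, not in the model's forward pass.
    """
    rng = random.Random(seed)
    scaled = [l / temperature for l in logits]
    # Softmax (subtract the max for numerical stability).
    m = max(scaled)
    probs = [math.exp(s - m) for s in scaled]
    total = sum(probs)
    probs = [p / total for p in probs]
    if top_k is not None:
        # Zero out everything outside the top-k, then renormalise.
        cutoff = sorted(probs, reverse=True)[top_k - 1]
        probs = [p if p >= cutoff else 0.0 for p in probs]
        total = sum(probs)
        probs = [p / total for p in probs]
    # Draw one index from the categorical distribution.
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

# With the same seed, the "stochastic" model is perfectly repeatable.
a = sample_token([2.0, 1.0, 0.1], temperature=0.7, top_k=2, seed=42)
b = sample_token([2.0, 1.0, 0.1], temperature=0.7, top_k=2, seed=42)
```

So "ChatGPT answers differently each time" is a product choice (sampling with a fresh seed), not something inherent to the math.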

>We/they are selling tailor-made "AI products" to other businesses, but apparently we don't know how sampling works...?

Salespeople don't tend to know jack. That doesn't mean there isn't an introvert in the back who does know what's going on.

>Am I just too junior/naive to get this or am I cooked?

AI, for the most part, has only been out a couple of years, with rapid improvements and changes that make 2023 knowledge obsolete. 100% of us are juniors in AI.

You're disillusioned because the "AI experts" basically don't exist.


👤 rvz
> One senior dev (also the team lead) tried to explain to me that AI is a subfield of machine learning and is always stochastic in nature (since ChatGPT responds differently to the same prompt).

This "senior dev" has it all mixed up and is incorrect.

"AI" is an all-encompassing umbrella term that covers, among other things, the very old GOFAI (good old-fashioned AI), which is rule-based; machine learning (statistical, Bayesian) methods; and neural networks, which underpin deep learning and, more recently, the generative AI that ChatGPT uses.

More accurately, it is neural networks in general that are "stochastic" in their predictions and decisions, not just the transformer models that ChatGPT is based on.
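To be precise about where the randomness lives: the forward pass of a trained network is a fixed function, so greedy (argmax) decoding gives the same token every time; the variation people see comes from the sampling step layered on top. A toy illustration (the scores are invented, not from any real model):

```python
import random

# Hypothetical output scores for three tokens from one forward pass.
# A trained network is a fixed function: same input -> same scores.
logits = [3.2, 3.1, 0.4]

def greedy(scores):
    """Argmax decoding: fully deterministic, no randomness involved."""
    return max(range(len(scores)), key=lambda i: scores[i])

def sampled(scores, rng):
    """Toy sampling step: picks uniformly among the two best tokens.
    This layer, not the network itself, is what varies run to run."""
    top2 = sorted(range(len(scores)), key=lambda i: -scores[i])[:2]
    return rng.choice(top2)

runs = {greedy(logits) for _ in range(100)}  # always the same token
```

So "ChatGPT responds differently to the same prompt" tells you the product samples, not that AI is stochastic by definition.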

> Am I just too junior/naive to get this or am I cooked?

Quite frankly, the entire team (except you) is cooked, as you have realized what you don't know.


👤 mathattack
How large is the company?

Many times, plum resume-building assignments at big companies go to the best politicians rather than to the biggest experts.


👤 brailsafe
Eh, you're probably overthinking it. Let it matter if it matters, or if it's your liability, but otherwise take the opportunity for what it is and focus on what you're there to do. The purpose of any of it is to bring money in, and if that happens, the rest matters less.

Early on in my career I was hyper-fixated on building features correctly at this particular company, according to what I thought was a proper way to build websites. I was probably right, but my job wasn't to be right, my job was to get things done in a certain period of time according to whatever people who controlled the money at the company thought was important, not what a nerd would necessarily care about.

When you're in school or just graduated, you're basically qualified to start learning (outside academia) and it's important to pay attention to what other people value, then do your best within that until you have the power to determine what's worth valuing.


👤 linkjuice4all
Hey I resemble that remark!

There's definitely a rush of people trying to upskill/reskill into this technology space despite having no formal training or background beyond basic dev skills. There are other people (such as myself) who came from the big data/NLP space (ads & search) and are trying to add AI to our extensive skill sets, but who aren't necessarily deep-math experts.

Unfortunately there's not a lot of room at the top and the vast majority of AI implementations at smaller companies are just OpenAI API wrappers. Essentially there's very little lived experience since it's expensive to experiment at home and smaller companies just aren't going to invest in self-hosted models that are expensive to run and quickly fall behind state of the art.


👤 nixpulvis
> Also, no one could tell me where exactly our "self-hosted" models even run (turns out about 50% of the time it's just OpenAI/Anthropic)

This part is honestly the most worrying to me; compliance, customer contracts, and legal all really require you not to lie about this.


👤 rglover
This is the nature of tech now (perhaps the whole time due to being a relatively "new" field). Most people don't have the slightest clue what they're doing beyond their ability to parrot buzzwords.

Mean? Sure. Reality? You betcha. It's incredibly rare these days to encounter truly competent professionals. Most are just hoping the guy below them doesn't know enough to spot their shortfalls and speak up.

This aligns shockingly well with Uncle Bob's rough stat: “The number of programmers doubles every five years or so. This means that half the programmers in the world have less than five years of experience.”


👤 b-karl
The field of AI has become very big in recent years, and people are becoming more and more specialized, just like in the rest of software or R&D. There's all sorts of model building and development, integration into other traditional software, infrastructure, deployment and devops, and now also all the governance and compliance for a lot of fields. It's good for people working in those stacks to understand the full chain at some level, but pretty quickly you end up with experts in particular parts of the chain and no one who understands it all.

And then there’s of course career climbers playing politics and people getting into the field because of interest or resume building.


👤 bogomipblips
You are obviously someone who would appreciate a pedantic style of educator. The average customer does not at all appreciate not being met where they are, which tends to lead to an expert using "AI" as shorthand for OpenAI, etc., with the customer, unless they hedge carefully.

Similarly, even if they participated in all the early arguments about where your models would be located, they may have no idea now, having grown fed up with the endless thread of subtle change requests.


👤 AJRF
Do you think sampling is deterministic?

👤 Insanity
I mean.. to play devil's advocate.. they don't need to understand how LLMs fundamentally work, any more than most programmers understand assembly, if all they do is build agents and prompt engineering lol.

👤 bawolff
The entire AI ecosystem is a giant hype bubble. I don't really think it matters much whether your team understands AI; the bubble is going to pop either way.

👤 nonameiguess
I did my undergrad in applied math and MS in machine learning, worked writing automated trading algorithms for a few years before drifting into infra layer, and there is no universe in which I'd consider myself anywhere remotely close to an expert in AI and I'm really not sure such people exist outside of the senior leadership at major labs, i.e. the LeCun/Hinton types.

But I know enough to know that neither AI nor machine learning is a subfield of the other. AI developed out of the very earliest days of electronic computing as an expression of the desire to get intelligent behavior out of computers by any means possible. Machine learning arose from the desire to express functions in which we know the inputs and outputs but not the form of the function itself, so we use various estimation methods that can be learned from the data. There was a whole lot of overlap, with parallel efforts simultaneously developing the same or similar techniques: computer scientists and software engineers on one side, statisticians and applied mathematicians on the other. It seems to have turned out that statistical methods generally provide the best algorithms for machine learning, and machine learning provides the best algorithms for getting intelligent behavior out of computers.

So they've kind of grown together, stats, automated learning, and AI, but they're still distinct things that developed independently of one another and still exist independently of one another.

This is putting aside all the various "big data" technologies and efforts that grew out of the 2007-or-so era of collecting enormous amounts of user- or machine-generated data. That data required new tech to store and query it, and new ways to perform parallel batch processing, often married to the storage and query tech. All of this was necessary for statistical machine learning to become as successful as it has, but it is completely separate from the mathematical and algorithmic discipline itself.

Even the guys I named above are probably not really experts in all of these things separately. As with anything, it takes a village.


👤 dboreham
LLMs (and ML before that) have attracted a class of hand-waving bullshitter. Hardly surprising --- anyone who knows what they're doing in whatever field is going to be busy doing their thing. Meanwhile some new hot tech comes along, who has the time to poke into it? Mr Useless who never had anything to do. Meanwhile we're digging into the math of transformers and finding it fascinating while they're goofing around with "prompt engineering".

👤 Razengan
About 6-7 years ago, I was staying in an Airbnb where the host was an Indian doctor, and one of his other guests also happened to be a coder. One evening we were sitting at the shared dining table, and the host brought up whether someone could make an "AI medical app".

He didn't say what the app would do or offer; he just said "an AI medical app hasn't been done before". He just wanted it to be "AI".

Before that I had people ask me if I could help them make a website, or an app, and they wouldn't say what the app/website would actually do, just that it would be nice to have one.


👤 Ancapistani
This is pervasive right now.

I’m at a company, on an AI team. I’ve been in this role for a year and we’ve not delivered a single AI… anything. Worse, we had a failed launch midway through the year and instead of re-evaluating our focus, are doubling down. New people have been brought onboard and the culture has completely shifted.

If anyone is looking for someone with ~20 years of SWE experience who really, really wants to deeply understand internal processes, how they relate to the company’s product and bottom line, and implement with an eye toward efficiency and automation… let me know.