I had coffee with a guy in Florentin last month. Senior backend engineer, twelve years at a payments company that pays well. He'd been job-hunting for four months. Zero callbacks.
He showed me his CV. It was good. Java, Kotlin, microservices, the usual stack, real numbers in the bullet points, a Master's from the Technion. The kind of CV that should have phones ringing.
"What's going wrong?" he asked.
I asked him a question back. "When did you last open ChatGPT?"
He looked at me like I'd asked when he last brushed his teeth with a fork. "I don't really use it. Why?"
That was the conversation. Four months of silence on his applications, and the answer was sitting on his iPhone, unopened.
What changed in 2025-2026
Here's the part nobody is saying out loud. Six months ago, "experience with AI tools" was a bonus line on a job posting. Something HR copy-pasted into the requirements section because the CTO said it should be there. Nobody checked.
Now it's a filter.
I've talked to recruiters at Check Point, Wix, Monday, Mobileye, three or four startups in Ramat Gan. The story is the same everywhere. They're getting 200-400 CVs per opening. They have to throw out 80% of them in the first pass. The new heuristic, the one that wasn't there last year:
If the candidate doesn't mention AI tools they actually use, the CV goes in the no-pile.
Not "AI experience." Not "ML certification." Not "trained models in PyTorch." Just: do you, the working engineer, use AI in your day job, and can you tell me about it in concrete terms?
If you can't, you're getting filtered out before a human reads a single line about your real work.
What "minimum" actually means
Forget the LinkedIn influencer takes. Here's what an Israeli hiring manager actually wants to hear when they ask "do you use AI in your work":
One coding assistant, used daily. Cursor, Claude Code, Copilot, Windsurf — pick one. Use it for two weeks. The interviewer wants to hear "yeah, I use Cursor for boilerplate and Claude Code when I need to refactor across files." Specific. Boring. Real.
One LLM chat, used weekly for non-code work. ChatGPT, Claude, Gemini. For drafting emails, summarizing PRs, walking through a system design before you start writing it. Not for jokes. For work.
A general sense of what these tools can't do. This one matters more than people think. The person who says "AI is amazing, it does everything for me" gets red-flagged. The person who says "I use it for X and Y but I never trust it for Z because it hallucinates database schemas" gets respect. Knowing the limits proves you've actually used it.
One concrete project where AI saved you time. Doesn't have to be impressive. "I had to write a one-off CSV parser, normally takes me 30 minutes, Claude Code wrote it in 2." That's enough. The point is the story, not the scale.
That's the minimum. Not a bootcamp. Not a certification. Two weeks of actually using the tools, paying attention to where they help and where they fail.
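For scale, the kind of throwaway task in that CSV-parser anecdote looks something like this. A minimal sketch in Python, with invented column names and a hypothetical `parse_feedback` helper — the point is that it's small enough to verify by eye before you trust it:

```python
import csv
import io

def parse_feedback(raw: str) -> list[dict]:
    """Parse a small CSV export into a list of normalized dicts.

    Strips stray whitespace from headers and values and drops fully
    blank rows -- the kind of one-off cleanup an assistant drafts in
    a couple of minutes and you sanity-check in one read.
    """
    reader = csv.DictReader(io.StringIO(raw))
    rows = []
    for row in reader:
        cleaned = {k.strip(): (v or "").strip() for k, v in row.items()}
        if any(cleaned.values()):  # skip rows that are entirely empty
            rows.append(cleaned)
    return rows

sample = "name, rating\n Dana , 5\n\nYoav,4\n"
print(parse_feedback(sample))
# → [{'name': 'Dana', 'rating': '5'}, {'name': 'Yoav', 'rating': '4'}]
```

Thirty minutes by hand, two with a tool — and, crucially, short enough that reviewing the output is faster than writing it yourself. That review step is the part interviewers want to hear about.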
What you don't need
You don't need to know how transformers work. Nobody is asking you to retrain a model.
You don't need a "Generative AI" certification. They're mostly worthless. The hiring manager has seen 40 of them this month.
You don't need to be a "prompt engineer." That title is going to be embarrassing in a year. Stop putting it on your CV.
You don't need to use AI for everything. The person who says "I have Claude write all my code now" is signalling they don't know what they're doing. The honest answer is some-of-the-time.
The Israeli market specifics
A few things that are weirdly local:
Hebrew prompting is now a soft skill. I'm not joking. Half the work happens in mixed Hebrew-English contexts (Slack, internal docs, Hebrew customer feedback). The candidate who can write a prompt in Hebrew and get a useful response back wins the room. Most people can't: the English-only prompters switch back to Hebrew the moment their grammar embarrasses them, and the prompt falls apart.
The army connection still matters. If you came out of 8200 or 81 and your unit used AI tools, say so. Don't be vague. Specifics get respect. The HR person knows which units actually used what.
Israeli tech is paranoid about IP leakage. If you're being interviewed and they ask about AI workflow, they want to hear you say "I never paste production code into the public chat — we use the enterprise plan with no training, or I run it locally." That answer alone separates you from half the field.
The honest part
I should say this clearly because nobody else does. The Israeli tech market right now is brutal. The juniors are getting nothing — three years of experience and you're competing with five hundred people who also have three years of experience and now also use Cursor. The mid-levels with 5-10 years are mostly fine if they update their toolkit. The seniors are mostly fine because the job market is desperate for people who can review what AI writes.
If you're in the brutal slice (junior, or mid-level who hasn't updated in two years), the AI thing is the cheapest fix you can make right now. Two weeks of practice. One concrete project. Update the CV. The next round of applications will land differently.
I told my Florentin friend this. He spent the next two weeks pairing with Cursor on a personal project, rewrote his CV with three concrete AI-workflow bullets in his current role, sent applications to the same companies again.
He had two callbacks in the first week. He's interviewing at one of them now.
It wasn't talent that changed. It was the language he was using to describe his work. The work was already good.
If you want a calibrated read on how your own profile lands with Israeli recruiters, the Korotchaim Browser Companion runs on your LinkedIn profile and tells you which sections need the AI-context update first. Free to install, walks you through fixes in five minutes.
Related: Can AI Replace You? An Honest Answer · How to Find Work in Israeli Tech in 2026