20 years ago there seemed to be very little NLP literature on languages other than English. Today I see papers on arXiv every day where people train an LLM for some “minor” language or run experiments with multilingual models, so your question is very much an active research area.
https://arxiv.org/search/?query=multilingual&searchtype=all&...
Another interesting thought related to this is that programming languages often have a spec or grammar available. Can we help LLMs learn faster or better by supplying these? How can a model draw out the common patterns across languages while still being effective with any one language on a specific problem? Can it few-shot learn a library in the ecosystem and map that onto the problem / solution? JSON vs JSON5 is an interesting example. I was trying to get ChatGPT to work with CUE, but it kept wanting to produce JSON or YAML instead.
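One cheap way to probe that idea is to put the spec (or an excerpt of it) plus a few worked examples directly into the prompt before asking the model for output in the target language. A minimal sketch in Python, where the CUE excerpt, the few-shot pairs, and the `build_prompt` helper are all illustrative assumptions, not any official spec text or LLM API:

```python
# Illustrative sketch: pack a language-spec excerpt and few-shot examples
# into a single prompt so a model is steered toward CUE instead of
# falling back to JSON or YAML. The excerpt and helper are hypothetical.

CUE_EXCERPT = """\
// In CUE, types and values unify; fields act as constraints.
#Config: {
    port: int & >0 & <65536
    host: string | *"localhost"   // *... marks a default value
}
"""

# (question, expected CUE answer) pairs shown to the model before the task
FEW_SHOT = [
    (
        "Describe a service listening on port 8080.",
        "config: #Config & {port: 8080}",
    ),
]

def build_prompt(task: str) -> str:
    """Assemble instructions, spec excerpt, few-shot pairs, and the task."""
    parts = [
        "Answer only in CUE, never in JSON or YAML.",
        "Here is an excerpt of the CUE language:",
        CUE_EXCERPT,
    ]
    for question, answer in FEW_SHOT:
        parts.append(f"Q: {question}\nA: {answer}")
    parts.append(f"Q: {task}\nA:")
    return "\n\n".join(parts)

print(build_prompt("Describe a service on port 9000 at host db.internal."))
```

Whether this actually fixes the JSON/YAML fallback is an empirical question, but it makes the experiment easy to run and to vary (full grammar vs. excerpt, zero-shot vs. few-shot).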