Yet from a software perspective, I see that tech is still relying on C, C++, and their descendants.
Why are there no new developments, no new programming languages in this area?
One thing that would be nice is a language that transparently targets both CPUs and GPUs. Right now you still have to shift gears and use a different language and toolset to run code on a GPU. Perhaps this could be achieved with bytecode and a virtual machine; Numba does something like that for Python.
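For anyone who hasn't seen it, here's a minimal sketch of what I mean by Numba's transparent targeting: the same Python function is compiled for different hardware just by changing the `target` argument of `@vectorize`. This assumes Numba is installed and, for the `cuda` target, a CUDA-capable GPU with the CUDA toolkit.

```python
import numpy as np
from numba import vectorize

# Identical Python source, compiled for different backends by swapping
# the target: 'cpu' (single-core native code) or 'cuda' (GPU kernel).
# Note: the 'cuda' target requires a CUDA-capable GPU and toolkit.
@vectorize(["float32(float32, float32)"], target="cpu")
def add_cpu(a, b):
    return a + b

@vectorize(["float32(float32, float32)"], target="cuda")
def add_gpu(a, b):
    return a + b

x = np.arange(1_000_000, dtype=np.float32)
y = np.ones_like(x)

print(add_cpu(x, y)[:5])  # executed as compiled CPU code
print(add_gpu(x, y)[:5])  # same source, executed on the GPU
```

The code itself never mentions the hardware; the compiler decides where it runs. That's the kind of thing I'd like to see built into a language rather than bolted on.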