[1] https://news.ycombinator.com/item?id=26492180
Also, understanding can be fleeting if you don't think about a certain topic all the time. Take relativity for example: I recall a very specific moment when I was talking to a friend who is actually formally educated in Physics, and we were talking about the speed of light, time dilation, mass expansion, etc. And I had a definite "aha" moment when I understood why mass has to increase with velocity and why time dilation had to occur. But... could I explain it to somebody else now (that was probably 10 years ago), and convey the same understanding? Do I even understand it as well now as I did in that moment? Arguably not. :-(
OTOH, if you just ask "What's a topic that you know a lot about?" I'd say, generally speaking, "firefighting" and "computer programming, especially in Java". shrug
So this is related to some other things, like the Weisfeiler-Leman algorithm, Morgan numbers, and partition refinement. The idea is that one way to compare two graphs (networks) for equality is to 'hash' each graph into a canonical form; the resulting canonical representations are then compared directly for equality.
I implemented a couple of different approaches for doing this - one based on a research paper calling the technique 'signatures', and the other based on an algorithm from a book called 'C.A.G.E.S' (can give references in a bit).
For me, the process of understanding how to achieve this meant learning about permutations, group theory, partition refinement, applying permutations to graphs, and so many other things. I think all told it took me more than a year to properly understand and implement it all!
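The refinement idea above can be sketched in a few lines of Python. This is a minimal 1-dimensional Weisfeiler-Leman colour refinement, not the signature or C.A.G.E.S. algorithms the comment mentions, and note it is not a complete isomorphism test (some non-isomorphic graphs collide):

```python
def wl_hash(adj, rounds=3):
    """1-dimensional Weisfeiler-Leman colour refinement.

    adj: dict mapping each node to an iterable of its neighbours.
    Returns a canonical 'hash': the sorted multiset of final colours.
    """
    # Start with a uniform colour for every node.
    colour = {v: 0 for v in adj}
    for _ in range(rounds):
        # New signature = own colour plus the multiset of neighbour colours.
        signature = {
            v: (colour[v], tuple(sorted(colour[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures back to small integers, canonically.
        palette = {sig: i for i, sig in enumerate(sorted(set(signature.values())))}
        colour = {v: palette[signature[v]] for v in adj}
    return tuple(sorted(colour.values()))

# A triangle and a 3-node path get different hashes; two relabelled
# triangles get the same one.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(wl_hash(triangle))  # → (0, 0, 0)
print(wl_hash(path))      # → (0, 0, 1)
```

Full canonical labelling (as in C.A.G.E.S.) has to search over permutations when refinement alone can't distinguish nodes, which is where the group theory comes in.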
Very fun, but maybe not the best use of time, looking back...
Though I have to admit an ever increasing number of details are getting fuzzy as they fade into the rear-view mirror of my attention.
As a foreigner fluent in German, with a very diverse set of German friends, and years of experience living there, in three very different regions at different stages of my life, I will make the bold claim that I understand Germany, even if I probably understood it more in the past than I will in the future.
I’m sure there are many people who understand it better, deeper, more completely; but I’m not sure any of them are Germans. Ask the fish about water, etc.
I think this kind of sounds like hubris. I plead guilty, but with the mitigating circumstance that my claim to understanding isn’t any more hubristic than the next guy’s.
Austria, on the other hand, is a complete mystery to me.
I once built a computer. I had Alan Clements' "Microprocessor Systems Design" for an undergraduate text and built a 68000-based machine with 64M of RAM and ROM. Using discrete ICs on two big Eurorack wire-wrap cards was hard work and needed good eyesight and dexterity. Then I wrote a bootloader and a Kermit-like RS232 program loader, and finally hashed together a really basic OS from Tanenbaum (with a synchronous scheduler and fixed job table). A long summer of late nights. Can I say "I understand computers"? No way! But I feel confident I know more about them than most people I'll ever meet.
Another heroic adventure was "Linux from Scratch", compiling everything along the way with tinycc.
These things are rites of passage. They don't mean I understand those things, and even if I did, that knowledge is obsolete today, but I'm glad I did them.
Essentially all those instruments healthcare companies invented to make coverage and payout rules complex.
The Prime architecture was based on the Multics design from a group of guys hailing from Honeywell and MIT. I worked on Primes for around 12 years, worked for Prime briefly in the 80's, and wrote an emulator that runs many versions of Prime's OS, Primos: https://github.com/prirun/p50em
Prime systems are very much the opposite of RISC, with instructions for:
- process exchange and task scheduling
- procedure call, incl arguments, ring crossing, stack handling, register saves
- decimal arithmetic, incl picture editing for COBOL & PL/I
- string instructions and picture editing
- 6 CPU modes for compatibility with older processors
- DMA, DMC, DMT, DMQ I/O modes for controller access to memory
- 32, 64, and 128-bit floating point
I got a book published with a bunch of community and personal findings as an "I wish I had known this earlier" kind of collection.
* Giant disclaimer: Bash is insanely complex, mostly for historical reasons. There are infinite ways of doing basically anything, and all of them have different trade-offs. The big one usually being between the complexity of the code and the kind of situations it can handle.
The science of recreational drugs is interesting to me.
... And no. It's not all my fault.
Also microinstruction vulnerabilities of CPUs.
Some people don't understand things; that is not the same as understanding nothing.
I also understand ashtanga and programming.
- Unicode and character encodings
- Native WebGL
- CSS3D
- Women
I lied about women.
It was hard for me to understand this arbitrary rule of things becoming less ordered over time. Was this just a fundamental natural law?
The answer is no. Entropy is a logical consequence of probability and time.
Why do things become more chaotic over time? Because chaotic configurations have a higher probability of occurring.
There are far more disordered configurations of things than there are ordered ones; this is why things become more disordered with time. Time changes the configuration, and a high-probability configuration is more likely to occur than a low-probability one. So the axiom of nature here is not entropy, it's probability. Entropy is just a consequence of probability.
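The counting argument can be made concrete with a toy system, sketched in Python. Take 20 two-state "particles" (coins): each macrostate "k heads out of 20" is realised by a number of microstates given by the binomial coefficient, and the mixed macrostates vastly outnumber the ordered ones (this toy model is my own illustration, not from the comment):

```python
from math import comb

n = 20  # twenty two-state particles (coins)

# Number of microstates realising each macrostate "k heads out of n".
microstates = {k: comb(n, k) for k in range(n + 1)}

# Each perfectly 'ordered' macrostate (all heads, all tails) has
# exactly one microstate...
print(microstates[0], microstates[n])      # → 1 1

# ...while the maximally 'mixed' macrostate dwarfs them.
print(microstates[n // 2])                 # → 184756

# Total microstates: 2**20.
print(sum(microstates.values()))           # → 1048576
```

If every microstate is equally likely and the configuration keeps changing, the system spends almost all its time near the mixed macrostates simply because there are so many more ways to be mixed than to be ordered.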
There are systems where ordered configurations are more numerous than disordered ones, and in those systems things become more ordered with time. In these cases entropy is STILL defined to be increasing as things get more ordered. Thus entropy is not describing disorder... or is it?
The thing I don't fully understand yet is heat entropy. Apparently, when you take heat into account for everything, the disorder intuition becomes applicable again: if a system is becoming more ordered with time, heat must be increasing somewhere to offset that increase in order. Maybe someone can explain this part to me?