Schol-R-LEA wrote:
I get the impression that either you didn't read my later post, or you didn't understand it.
Let me put it another way: the majority of working mathematicians never work with any actual numbers except for a handful of fundamental constants such as one, zero, e, i, and π - and they almost never reference the actual values of those constants, just the relationships they have to the logical entities they are working with. It is rare for a mathematician to have to compute a value for anything.
There are well-known stories of mathematicians and theoretical physicists working on the Manhattan Project complaining about the fact that they suddenly were confronted with actual values, and how unaccustomed to that most of them were.
That is much like writing source code, except that the only machine involved is the brain. Somebody formulates theorems and other concepts instead of programs, and somebody else uses them, but there is always at least logical calculation involved, built up over thousands of years of thought.
Schol-R-LEA wrote:
It is no accident that the words 'algorism' (the process of computing by means of transforms on Arabic numerals, which was a radical thing when it was introduced to Europe in the 15th century) and 'algorithm' (a much more modern - mid-19th century - word derived from 'algorism', meaning a description of the process by which a specific value could be computed) were relatively obscure before the introduction of modern calculating hardware.
Mathematics isn't about computation. The idea that everything in a computer system is mathematical is laughably wrong, and does a disservice to both mathematics and computers.
Current CPUs, and even more so most existing software (at least the part understood by most people), are not as complete and modern as mathematics itself. Mathematics needs to be introduced fully and formally into computing, so that computers can perform massive mathematical analysis natively: with no libraries, or only optional ones, and with all functions implemented in the CPU/FPU.
And well, I can't think of anything in a computer that isn't mathematical. Maybe it isn't algebraic, trigonometric, or even arithmetic; maybe it's only sequential. But if it's numerical, and if it needs to be accessed numerically, then it's purely mathematical:
- Strings are mathematical values with ranges and sizes, and they can be compared for any imaginable purpose. There could even be string characters with floating-point values for some special purpose.
- Bytes are just values from 0 to 255 when unsigned, or from -128 to 127 when signed.
- I/O and memory ranges are numerical.
- Comparisons subtract two values to determine whether they are ==, <, <=, >, or >=; or, at the very least, they operate on bits in practically the same way, purely numerically.
- Images, sound, and codes such as keystrokes/scan codes, device commands, etc., as well as the algorithms to encode, compress, and store them, are all numerical at least at a basic level. They all need to compare numbers, characters, or string values, and nothing else, apart from logic, loops, functions, and mathematical series, which in turn use numerical addresses, counters, etc., just like the mathematical formulas for sine.
- Encryption, whether unbreakable or extremely weak, always ends up being numerical in order to encode and reconstruct its values; and it can use as much logic and as many elements as a programming language, or, reciprocally, as any highly complex process in mathematics.
Please tell me about something that can be stored and processed by a computer that doesn't, in the end, come down to a set of numbers. I can't think of any device or file format that doesn't use numbers and mathematics, high or low, to yield usable results as software and user data.