In order to free myself from C and its interfaces, I decided to design a new language.
Have you worked through all of the problematic areas of C and have you considered how these problems are dealt with in other languages?
... the compiler has to ensure, for example, that you are not dividing by zero,
You can't do that in all cases, which raises the question: how is the programmer supposed to deal with it? Should the compiler insert checks into the code when it can't prove impossibility of division by zero? If so, what should that code do when it detects division by zero? Should there be a special syntax (or other construct) to tell the compiler that the situation is impossible? Or should there be a special syntax (or other construct) to check for division by zero, which the compiler will always be able to understand? (See, if you can't always prove impossibility of division by zero, the check for zero may itself be misunderstood by the compiler if it's overly complex, too far away from the division operator and so on.)
or that the value produced by the expression on the right side of the "=" fits in the variable on the left side of the "=".
Via a mandatory cast (e.g. Java and Go) or by employing arbitrary precision arithmetic (e.g. Python)?
What about comparing signed and unsigned integers? Are you going to do it half-assed as in C/C++ and Go, requiring multiple checks and/or casts, or are you going to restore the mathematical sense for once? This is a frequent problem, often with security implications.
My general intention is to make programmer errors harder to commit. I am aware this may annoy programmers while they are getting used to it, but I am also aware it will reduce debugging time, since errors will be rarer.
Properly learning one's tools (saws and programming languages and school-grade math) can achieve that.
Consider now an out-of-bounds error, which involves using a variable as an index to access an element of a 12-element array. If the variable has the value 12 or greater, or is negative, it will definitely result in an error which, unlike the divide-by-zero error, may not even be evident at runtime. The compiler should be able to ensure the variable is in range in order to compile the code.
But it can't always do that. For example, in Java you can't have an arbitrary reference (pointer). It can only point at a live object or be null. When the compiler can't prove non-nullness, it has to check for it. One way is an explicit compare instruction. Another works for array elements (and object fields) that are no farther than a page or a few pages from the beginning of the containing object: page 0 (and possibly a few more) can be left unmapped, causing page faults on accesses through null references. However, if you don't know the index bounds at compile time, you still need an explicit check against the element count at run time. Again, how do you detect the problem, and what do you propose the programmer and the generated code do about it?
Behaviour should also be well-defined as much as possible.
Agreed. C has a bit too much of undefined and unspecified behavior. Java got rid of some of the absolutely unnecessary ones.
It is evident that even experienced programmers spend a lot of time writing code carefully so as not to invoke undefined behaviour.
I've seen many in Android source code. And one would think that Google's got the best programmers. Apparently not.
Ditto for Microsoft (been there) and Amazon (had a chance to be interviewed by someone who did know about undefined behavior but still insisted on it being somehow predictable or possible to reason about, lol) and the rest of the world is no better.
I am thinking of having allowed ranges for variables. A variable representing a weekday would have a range equal to [0, 6] or [1, 7], depending on what you like. Trying to assign the value 8 to it would result in an error, since 8 is out of range.
Again, same questions, how do you detect it and what do you propose the programmer or the generated code do when the detection succeeds?
Booleans should not be built on top of integers like in C. Consider the "a + (b == c)" expression. Is there any real use for it?
I use it when not disallowed to. I don't think this is an example of a big or important problem, though. I should probably repeat the initial questions so you're not sidetracked into stuff of secondary or tertiary importance or into adding features...
Have you worked through all of the problematic areas of C and have you considered how these problems are dealt with in other languages? Take the language standard and go through it if you haven't yet. Write down the problems (some are conveniently grouped in the annex devoted to undefined behavior). Then read up on other languages.
Speaking of adding useful features, have you heard of C++ proposing array/string views/spans?