The tradition of using i as a "meaningful" integer index started with the implicit typing rules employed by old Fortran 77 compilers. Fortran 77 implicitly typed any variable whose name began with a letter from i through n as an INTEGER; variables beginning with any other letter defaulted to REAL. Implicit typing is bad coding practice.
(Don't use n for an integer just because Fortran 77 would type it as one. Use n because your math book uses it.)

In languages like C and C++ you still have to explicitly declare a variable that starts with i-n (or any other letter) as an integer before using it. That's explicit, static typing, and the compiler enforces it by rejecting any use of an undeclared variable, so it's good coding practice.
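
As a minimal sketch (the variable names are only illustrative), a C compiler cares solely about the declared type; the first letter of the name carries no meaning:

    #include <stdio.h>

    int main(void)
    {
        /* Every variable must be declared with an explicit type before use;
           the letter its name starts with means nothing to the compiler. */
        int i = 42;        /* an integer because it is declared int, not because it starts with i */
        double n = 3.1416; /* perfectly legal: n is whatever type you declare it to be */

        printf("i = %d, n = %f\n", i, n);

        /* j = 0;  <- will not compile: j was never declared */
        return 0;
    }

Uncomment the j line and compilation fails, which is exactly the kind of check the old implicit-typing rule never gave you.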