Let's take a complicated example for fun. Say we have
char a[10][20][30] = { 0 };
The size of a will be sizeof(char) * 10 * 20 * 30, so sizeof(a) = 6000 (sizeof(char) is 1 as per the C99 standard). a can be seen as an array of (10) arrays of (20) arrays of (30) characters. Now:
a[0] drops one dimension, giving us an array of (20) arrays of (30) characters
a[0][0] will give us an array of (30) characters
a[0][0][0] is a character.
In all these examples, the index 0 simply selects the first element at the respective array level.
Now, finding the length of an array by doing sizeof(a) / sizeof(a[0]) is a trick rooted in the above logic. sizeof(a[0]) is nothing but the size of an array of (20) arrays of (30) characters, which is 600. sizeof(a) / sizeof(a[0]) = 6000 / 600 = 10, giving back the length of the first dimension. Similar math can be done for the other dimensions too.
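Putting that arithmetic into a small program makes it easy to verify (the numbers are exact here, since sizeof(char) is 1 by definition):

#include <stdio.h>

int main(void)
{
    char a[10][20][30] = { 0 };

    /* sizes at each level of the array */
    printf("sizeof(a)          = %zu\n", sizeof(a));          /* 10 * 20 * 30 = 6000 */
    printf("sizeof(a[0])       = %zu\n", sizeof(a[0]));       /* 20 * 30 = 600 */
    printf("sizeof(a[0][0])    = %zu\n", sizeof(a[0][0]));    /* 30 */
    printf("sizeof(a[0][0][0]) = %zu\n", sizeof(a[0][0][0])); /* 1 */

    /* the length-of-first-dimension trick */
    printf("length = %zu\n", sizeof(a) / sizeof(a[0]));       /* 6000 / 600 = 10 */
    return 0;
}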
Since in your question the element type is a pointer, char*, sizeof(char*) should be taken as the base factor, which is then multiplied by the length of each dimension. The size of a pointer type depends on the machine/architecture and the compiler you use.
Each of us will have a different machine and a different compiler running on it, so we need a common reference for the explanation. Running your program in an online compiler gives the result 40 / 8 = 5. As I've stated above, the size of a pointer type varies with the platform and compiler.
As you've written in the comment, your array is of type char* [5][2]. Dereferencing with [0] removes one level and we get char* [2]; thus sizeof(var[0]) = sizeof(char* [2]) = 8, i.e. the size of two char pointers is 8, which implies that sizeof(char*) is 4 on that online machine. On this basis sizeof(var) = 4 * 5 * 2 = 40, which is what we see in the output, thereby correctly giving the first array length as 40 / 8 = 5.
Now your output, as mentioned by glglgl, is a bit different (perhaps your machine or your compiler's data model is 16-bit); the machine you're running it on, in combination with your compiler, seems to give 2 as the size of a char pointer, i.e. sizeof(char*) = 2. Now when you do var[0], we have char* [2]; its size equals sizeof(char*) * 2 = 2 * 2 = 4. Likewise sizeof(var) = 2 * 5 * 2 = 20. Thus you get 20 / 4 = 5 as output.
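For completeness, here's a small sketch you can run yourself; I'm reusing var as the array name from the discussion above. The absolute byte counts follow whatever sizeof(char*) is on your platform (4 on that online machine, 2 in your 16-bit case), but the ratio comes out as 5 everywhere:

#include <stdio.h>

int main(void)
{
    char *var[5][2] = { 0 };

    /* sizeof(var)    = sizeof(char*) * 5 * 2
       sizeof(var[0]) = sizeof(char*) * 2      */
    printf("sizeof(char*)  = %zu\n", sizeof(char *));
    printf("sizeof(var)    = %zu\n", sizeof(var));
    printf("sizeof(var[0]) = %zu\n", sizeof(var[0]));

    /* first-dimension length: always 5, whatever the pointer size */
    printf("length = %zu\n", sizeof(var) / sizeof(var[0]));
    return 0;
}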
How could I know how many bits do I have per element of the 1st dimension, I mean through calculation?
In char* [5][2], each element of the first dimension has the type char* [2], thus its size is 2 * sizeof(char*).
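A tiny check shows the same calculation in code (again assuming the var from your comment); both lines print the same number:

#include <stdio.h>

int main(void)
{
    char *var[5][2] = { 0 };

    /* each element of the 1st dimension is a char* [2],
       so its size is 2 * sizeof(char*) */
    printf("sizeof(var[0])    = %zu\n", sizeof(var[0]));
    printf("2 * sizeof(char*) = %zu\n", 2 * sizeof(char *));
    return 0;
}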
Hope it helps!