I'm new to C# and new to this website, and I couldn't find a similar post on the subject, so sorry if it has already been covered.
As part of a video course, I was instructed to create a console app that calculates the average of students' scores. The user enters the total number of students, then each individual score, and the sum is divided by the number of students.
What I can't understand is why the program asks for one extra score: the user enters 5, and I get prompted for a score 6 times...
using System;

class Program
{
    static void Main(string[] args)
    {
        int totalScores = 0;
        int totalStudents = 0;
        int number;

        Console.WriteLine("how many students are there?");
        int studentNumber = int.Parse(Console.ReadLine());

        while (totalStudents <= studentNumber)
        {
            Console.WriteLine("enter your student score");
            string studentScore = Console.ReadLine();

            if (int.TryParse(studentScore, out number))
            {
                totalScores += number;
                totalStudents += 1;
            }
            else
            {
                Console.WriteLine("you did not enter a number");
            }
        }

        int average = totalScores / totalStudents;
        Console.WriteLine("the average is " + average);
    }
}
This is what I see in the console:
[screenshot of the console output]
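In case it's easier to see without the rest of the program, here's a stripped-down loop that shows the same counting behaviour I think I'm hitting (the class name and the hard-coded 5 are just placeholders I made up for this post in place of the user input):

using System;

class LoopTest
{
    static void Main(string[] args)
    {
        int studentNumber = 5;   // pretend the user typed 5
        int totalStudents = 0;

        // same loop condition as in my program above
        while (totalStudents <= studentNumber)
        {
            Console.WriteLine("asking for score #" + (totalStudents + 1));
            totalStudents += 1;
        }

        Console.WriteLine("asked " + totalStudents + " times in total");
    }
}

This prints the "asking for score" line six times, which matches what I see in the full program.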
I'm sure I'm missing something... any advice?