I've run into an odd problem with this code:
legibIndex = 206.385 - 84.6 * (countSylb / countWord) - 1.015 * (countWord / countSent);
This is the calculation for the legibility index of a given text file. Since this is a homework assignment, we were told what the index should be (80, or more precisely, 80.3).
My syllable count, word count, and sentence count are all correct (they match up with the given numbers for the sample text files).
Even if I hardcode the numbers in, I do not get 80, even though I do when I put it into my calculator exactly as written. I can't imagine what is wrong.
Here is the equation we were given:
Index = 206.835 - 84.6 * (# syllables/# words) - 1.015 * (# words/# sentences)
As you can see, I just plugged in my variables (which are holding the correct values). For reference, the values are 55 syllables, 40 words, and 4 sentences, as given by the instructor. The value my program produces when run is a legibility index of 112.
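Working the given equation by hand with those values: 206.835 - 84.6 * (55 / 40) - 1.015 * (40 / 4) = 206.835 - 84.6 * 1.375 - 1.015 * 10 = 206.835 - 116.325 - 10.15 = 80.36, which rounds to the expected 80.3. That's what my calculator gives, but not what my program prints.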
Am I missing some brackets, or what? I'm stumped!
Right off the bat, from the names (which include the word "count") I'd guess that countSylb, countSent, and countWord are declared as integers, and therefore your divisions are doing integer arithmetic, truncating the decimal portions. Cast them to floats and that should fix it:
legibIndex = 206.835 - 84.6 * ((float)countSylb / (float)countWord) -
             1.015 * ((float)countWord / (float)countSent);
(Also note that the given equation uses 206.835, while your code has 206.385; that transposition alone would shift the result by 0.45.)
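To make the truncation concrete, here is a minimal, self-contained sketch that runs both versions side by side with the instructor's sample values. I'm assuming Java, since the language isn't stated (the casts behave the same way in C), and the class name is just illustrative:

public class LegibilityDemo {
    public static void main(String[] args) {
        int countSylb = 55, countWord = 40, countSent = 4; // instructor's sample values

        // Integer division: 55 / 40 truncates to 1, and 40 / 4 gives 10,
        // so the fractional part of syllables-per-word is silently lost
        double truncated = 206.835 - 84.6 * (countSylb / countWord)
                                   - 1.015 * (countWord / countSent);

        // Casting one operand per division forces floating-point arithmetic,
        // so 55 / 40 is evaluated as 1.375
        double corrected = 206.835 - 84.6 * ((float) countSylb / countWord)
                                   - 1.015 * ((float) countWord / countSent);

        System.out.println(truncated); // ~112.085 -- the reported wrong result
        System.out.println(corrected); // ~80.36   -- rounds to the expected 80.3
    }
}

Note that only one operand of each division needs the cast; once either side is a float, the whole division is done in floating point.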