How come dividing two 32-bit int numbers as ( int / int ) returns 0, but if I use Decimal.Divide() I get the correct answer? I'm by no means a C# guy.
int is an integer type; dividing two ints performs an integer division, i.e. the fractional part is truncated since it can't be stored in the result type (also int!). Decimal, by contrast, has a fractional part. By invoking Decimal.Divide, your int arguments get implicitly converted to decimals.

You can enforce non-integer division on int arguments by explicitly casting at least one of the arguments to a floating-point type, e.g.:
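Here is a minimal sketch of both approaches (the example values 7 and 2 are just for illustration):

```csharp
using System;

class DivisionDemo
{
    static void Main()
    {
        int a = 7;
        int b = 2;

        // Integer division: the fractional part is truncated.
        Console.WriteLine(a / b);                 // 3

        // Cast one operand so the division is done in double.
        Console.WriteLine((double)a / b);         // 3.5

        // Decimal.Divide: the int arguments are implicitly converted to decimal.
        Console.WriteLine(Decimal.Divide(a, b));  // 3.5
    }
}
```

Casting just one operand is enough, because the other operand is then promoted to the same type before the division takes place.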