Java: Simple BigDecimal logical error

I've got a simple piece of code that isn't behaving as it should. The code attempts to add up an array of BigDecimals and then divide by array.length to find an average. However, the first phase of the algorithm fails to add the array elements together correctly (in the variable "sum").

public BigDecimal getAverageHeight() {
    BigDecimal sum = new BigDecimal(0);
    BigDecimal[] heights = getAllHeights();
    for (int a = 0; a < heights.length; a++) {
        sum.add(heights[a]);
        System.out.println("Height[" + a + "] = " + heights[a]);
        System.out.println("Sum = " + sum.setScale(2, BigDecimal.ROUND_HALF_UP));
    }
    return sum.divide(new BigDecimal(heights.length));
}

The output is as follows:

Height[0] = 24
Sum = 0.00
Height[1] = 24
Sum = 0.00
Height[2] = 24
Sum = 0.00
Height[3] = 26
Sum = 0.00
Height[4] = 26
Sum = 0.00
Height[5] = 26
Sum = 0.00

I'm sure it's a simple error, but I'm getting tired of staring at the problem. Thanks in advance.
For future reference, your problem is solved by reading the Javadoc: BigDecimal is immutable. add() does not modify sum in place; it returns a new BigDecimal holding the result, and you are discarding that return value, so sum stays at zero. Reassign it: sum = sum.add(heights[a]); The same applies to setScale(), divide(), and the other arithmetic methods on the class.
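
A minimal sketch of the corrected method, assuming getAllHeights() returns a non-empty array. Besides reassigning sum, it passes an explicit scale and rounding mode to divide(), which avoids an ArithmeticException when the average has a non-terminating decimal expansion, and uses RoundingMode.HALF_UP rather than the older BigDecimal.ROUND_HALF_UP constant:

import java.math.BigDecimal;
import java.math.RoundingMode;

public BigDecimal getAverageHeight() {
    BigDecimal sum = BigDecimal.ZERO;
    BigDecimal[] heights = getAllHeights();
    for (int a = 0; a < heights.length; a++) {
        // add() returns a new BigDecimal; the result must be assigned back to sum
        sum = sum.add(heights[a]);
    }
    // Supply a scale and rounding mode so divide() can't throw on a
    // non-terminating result such as 1/3.
    return sum.divide(new BigDecimal(heights.length), 2, RoundingMode.HALF_UP);
}

With the sample data above (24, 24, 24, 26, 26, 26) this returns 25.00.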
