Abstract

Textbooks and courses on numerical algorithms contain numerous examples which lead students to believe that the algorithm of choice for computing the zeros of a function f is Newton's method. In many of these courses little or no time is spent providing students with "real world" experiences where Newton's method fails. The work presented in this paper describes a slow-convergence problem encountered while trying to use Newton's method to estimate values for the χ² distribution. The problem occurred while the authors were trying to implement a well-known machine learning algorithm from the field of artificial intelligence. The function being evaluated and the convergence problem with Newton's method are described. Numerical results are given which indicate that a hybrid algorithm, consisting of Newton's method and the derivative-free bisection algorithm, not only provides good results but also converges quickly and consistently. © 1994, ACM. All rights reserved.
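The abstract does not reproduce the authors' implementation, so the following is only a minimal sketch of a generic safeguarded Newton/bisection root finder in the spirit it describes: Newton steps are taken when they stay inside a sign-change bracket, and bisection is used as a fallback. All names (hybrid_newton_bisection, f, df, a, b, tol) are illustrative assumptions, not the paper's code.

```python
import math


def hybrid_newton_bisection(f, df, a, b, tol=1e-12, max_iter=100):
    """Find a root of f in [a, b], where f(a) and f(b) have opposite signs.

    A Newton step is accepted only when it lands strictly inside the
    current bracket; otherwise the iterate falls back to bisection,
    which guarantees the bracket keeps shrinking.
    """
    fa, fb = f(a), f(b)
    if fa == 0.0:
        return a
    if fb == 0.0:
        return b
    if fa * fb > 0.0:
        raise ValueError("f(a) and f(b) must have opposite signs")

    x = 0.5 * (a + b)  # start from the midpoint of the bracket
    for _ in range(max_iter):
        fx, dfx = f(x), df(x)
        if fx == 0.0 or (b - a) < tol:
            return x
        # Keep the half of the bracket that still contains a sign change.
        if fa * fx < 0.0:
            b, fb = x, fx
        else:
            a, fa = x, fx
        # Try a Newton step; reject it if the derivative is zero or the
        # step would leave the bracket, and bisect instead.
        if dfx != 0.0:
            x_newton = x - fx / dfx
            if a < x_newton < b:
                x = x_newton
                continue
        x = 0.5 * (a + b)  # bisection fallback
    return x


# Example (illustrative only): root of cos(x) - x on [0, 1].
root = hybrid_newton_bisection(lambda x: math.cos(x) - x,
                               lambda x: -math.sin(x) - 1.0,
                               0.0, 1.0)
```

The design point matching the abstract's claim is the fallback: pure Newton iteration can stall or diverge on flat or awkwardly shaped functions, while the bracketed hybrid retains bisection's guaranteed convergence and still enjoys Newton's fast local convergence when the steps are well behaved.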

Department(s)

Mathematics and Statistics

Second Department

Computer Science

International Standard Serial Number (ISSN)

0097-8418

Document Type

Article - Journal

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2024 Association for Computing Machinery. All rights reserved.

Publication Date

03 Dec 1994
