Playing around with various AI tools, as is common nowadays, I asked one of them to search the internet and summarise common questions, complaints and misconceptions about ojAlgo. I just thought it would be interesting to learn something about the weak spots, and maybe do something about them.

Below you’ll find the AI’s response. It listed a number of points in each category, and also provided a description, answer or comment on each point. I copied the points/descriptions verbatim. I then added my own comments in addition to what the AI reported. My comments are labelled “apete”.

Common Questions

  1. How does ojAlgo compare to other libraries like Apache Commons Math or JAMA?
    • ojAlgo is generally faster and more feature-rich for optimization problems, but users often wonder about performance tradeoffs.
    • apete: JAMA is purely a linear algebra package – no optimisation features. It’s public domain and quite old; it was first released in the late 90s and hasn’t been updated in many years. ojAlgo contains elements of JAMA – parts of the code were copied, extended and improved. There is absolutely nothing JAMA can do that ojAlgo doesn’t do better.
    • apete: Apache Commons Math (ACM) contains a large variety of maths-related code. The linear algebra and mathematical optimisation parts are what most naturally compare with ojAlgo, and in both areas ojAlgo is vastly superior. In particular, ACM’s optimisation features are limited and not very capable. If you use ACM because it has some other feature ojAlgo does not, you should consider moving to Hipparchus. ACM hasn’t had any releases for many years, because most of the developers/contributors left, forked the code base and started Hipparchus.
  2. Can ojAlgo solve large-scale optimization problems efficiently?
    • Yes, to an extent. It’s well-optimized for many problems, but very large problems may require specialized solvers like CPLEX or Gurobi.
    • apete: Depends on your definition of large-scale. If it means the “upper end” of what solvers like CPLEX and Gurobi can handle, then the answer is No. ojAlgo’s main selling point in this area is that it’s open source pure Java. As such there are very few alternatives. Compared to those alternatives ojAlgo is by far the most complete and capable, but that’s still a long way from the top commercial native code solvers.
  3. Does ojAlgo support integer programming (MILP)?
    • Limited support. Some features exist, but ojAlgo is not primarily designed for discrete optimization.
    • apete: ojAlgo definitely supports MIP, but this is an area where the gap to the top commercial native-code solvers shows. A minimal sketch of a small mixed-integer model follows right after this list.
  4. How do I interface ojAlgo with other systems (e.g., Python, R)?
    • This often comes up due to Java’s ecosystem isolation; there’s no direct bridge like JPype for ojAlgo, so this requires a Java wrapper.
    • apete: I don’t use Python or R…
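
To make the MIP point above concrete, here is a minimal sketch of a tiny mixed-integer model built with ExpressionsBasedModel. The class and variable names are made up for illustration, and exact builder method names (addVariable vs newVariable, for instance) have shifted a bit between ojAlgo releases, so treat this as an outline rather than definitive code.

import org.ojalgo.optimisation.Expression;
import org.ojalgo.optimisation.ExpressionsBasedModel;
import org.ojalgo.optimisation.Optimisation;
import org.ojalgo.optimisation.Variable;

public class TinyMipSketch {

    public static void main(String[] args) {

        ExpressionsBasedModel model = new ExpressionsBasedModel();

        // Two bounded variables; weight(...) is each variable's objective coefficient,
        // and integer(true) marks Y as an integer variable
        Variable x = model.addVariable("X").lower(0).upper(10).weight(3);
        Variable y = model.addVariable("Y").lower(0).upper(10).weight(2).integer(true);

        // Constraint: 1*X + 2*Y <= 14
        Expression c1 = model.addExpression("C1").upper(14);
        c1.set(x, 1);
        c1.set(y, 2);

        // Maximise 3*X + 2*Y subject to the constraint and bounds
        Optimisation.Result result = model.maximise();

        System.out.println(result);
    }
}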

Common Complaints

  1. Sparse matrix support is limited.
    • While basic support exists, it’s not as fully developed as in libraries like SuiteSparse or SciPy.
    • apete: Yes, that’s true, but I’d like to emphasise that there is some support for sparse data structures. In particular, much of the optimisation functionality depends on efficient sparse code. This post https://www.ojalgo.org/2020/09/sparse-and-special-structure-matrices/ shows that there are sparse arrays and matrices and, more importantly, demonstrates how to work with special-structure matrices in ojAlgo. There is also a small sparse-matrix sketch right after this list.
  2. Documentation can be dense.
    • The API docs are comprehensive, but newcomers often find it hard to locate practical usage examples.
    • apete: This I don’t understand or agree with. There are quite a lot of examples here https://www.ojalgo.org/code-examples/ and most of them are fully functional, complete programs – not just code snippets.
  3. Steep learning curve for optimization models.
    • Defining constraints and objectives isn’t as intuitive as in some DSLs like Pyomo or CVXPY.
    • apete: The way the optimisation objective is defined is different from what I assume most users expect, and I can see that some get it wrong until they read the documentation or look at an example. Given how important a part of ojAlgo the optimisation features are, I have to admit they are under-represented in the documentation. https://www.ojalgo.org/2019/05/the-diet-problem/ is an example of how to model a simple, classic optimisation problem, and it demonstrates everything you need to know to model optimisation problems. There is also a short sketch of the weight-based objective after this list.
  4. Performance tuning is opaque.
    • The library is fast, but users have limited control over internal performance settings like multithreading.
    • apete: Yes, we try to be fast AND hide complexity from the user. There is a blog post detailing some of the controls: https://www.ojalgo.org/2019/08/controlling-concurrency/
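
On the sparse-matrix complaint (point 1 above), here is a rough sketch showing that a dedicated sparse data structure does exist. This is my own illustration, not code from the linked post, and the factory naming has changed between ojAlgo versions (PRIMITIVE64 around the time of that post, R064 in more recent releases), so check the javadoc for the version you are on.

import org.ojalgo.matrix.store.MatrixStore;
import org.ojalgo.matrix.store.SparseStore;

public class SparseSketch {

    public static void main(String[] args) {

        int dim = 100_000; // arbitrary size, just for illustration

        // Factory name is version dependent: PRIMITIVE64 here, R064 in newer releases
        SparseStore<Double> sparse = SparseStore.PRIMITIVE64.make(dim, dim);

        // Only the explicitly set elements are stored
        sparse.set(0, 0, 2.0);
        sparse.set(0, dim - 1, -1.0);
        sparse.set(dim - 1, dim - 1, 3.0);

        // A SparseStore is a MatrixStore, so the usual operations are available
        MatrixStore<Double> product = sparse.multiply(sparse);

        System.out.println(product.get(0, dim - 1));
    }
}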
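
And on the modelling complaint (point 3), the part that seems to surprise people is that there is no separate “objective function” object in ojAlgo: a variable or expression contributes to the objective simply by being given a weight. The following sketch, with the same caveats about version-specific method names as above, is only meant to illustrate that idea.

import org.ojalgo.optimisation.Expression;
import org.ojalgo.optimisation.ExpressionsBasedModel;
import org.ojalgo.optimisation.Optimisation;
import org.ojalgo.optimisation.Variable;

public class ObjectiveByWeightsSketch {

    public static void main(String[] args) {

        ExpressionsBasedModel model = new ExpressionsBasedModel();

        // weight(...) on a variable puts that variable in the objective
        Variable a = model.addVariable("A").lower(0).weight(0.05);
        Variable b = model.addVariable("B").lower(0).weight(0.08);

        // An expression with bounds but no weight is just a constraint: A + B >= 10
        Expression demand = model.addExpression("Demand").lower(10);
        demand.set(a, 1);
        demand.set(b, 1);

        // ...while giving an expression a weight makes it part of the objective too
        Expression penalty = model.addExpression("Penalty").weight(2);
        penalty.set(b, 1);

        // Minimise 0.05*A + 0.08*B + 2*B
        Optimisation.Result result = model.minimise();

        System.out.println(result);
    }
}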

Common Misconceptions

  1. “ojAlgo is just for finance.”
    • While it has strong finance capabilities, it’s a general-purpose linear algebra and optimization library.
    • apete: The original use case was finance-related, but it’s just maths. There is no focus on any specific business or application domain.
  2. “It’s outdated or unmaintained.”
    • Not true. ojAlgo is actively maintained with regular updates and detailed documentation.
    • apete: It’s mature. Development started a long time ago but never stopped – it has been continuously worked on for many years.
  3. “It requires native libraries.”
    • False. ojAlgo is pure Java, which makes it easy to integrate and deploy.
    • apete: Where does this come from? Being pure Java with zero dependencies has always been a key feature and selling point.