
I mean, part of the problem is that no one can seem to agree on what "AI" even is anymore. You described a control theory model that "calculate[s] the envelope", which I'm interpreting to mean some sort of confidence interval/posterior, but to me that's exactly what you should be doing whenever possible in an "AI" model. In other words, control theory (it sounds like with Gaussian processes or something) is the right "AI" approach to that problem.
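(To make "confidence interval/posterior" concrete, here's a toy sketch of a GP producing a predictive mean plus an uncertainty band, assuming scikit-learn; the kernel and data are made up purely for illustration.)

    # Toy sketch: GP posterior mean and ~95% band, i.e. the kind of
    # uncertainty estimate I mean. Kernel/data are illustrative only.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)
    X = np.linspace(0, 10, 20).reshape(-1, 1)               # training inputs
    y = np.sin(X).ravel() + 0.1 * rng.standard_normal(20)   # noisy targets

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.1**2)
    gp.fit(X, y)

    X_new = np.linspace(0, 10, 100).reshape(-1, 1)
    mean, std = gp.predict(X_new, return_std=True)
    lower, upper = mean - 1.96 * std, mean + 1.96 * std     # ~95% band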

To me, linear/logistic regression is AI, as are neural nets, as is pretty much any statistical model. These days "AI" seems to be roughly interpreted as "whatever a DS person deems a solution to our business problem", at least in my very limited experience. It's all a bit silly, since AI has come to mean so many different things that may not fit the traditional definition of AI from, e.g., the '90s. But as long as people aren't pigeonholing themselves into particular methods, I'm happy to go along with slapping an "AI" label on pretty much anything.



I took envelope to mean the operating conditions under which the control system is stable. For many control systems this isn't really a confidence interval except insofar as the parameters of the control system are imperfectly known.

For as long as I can remember, AI has been a synonym for "hard CS problem" or even just "hard math problem." It has more use as a marketing term than as a technical term with precise meaning.


"I took envelope to mean the operating conditions under which the control system is stable."

Yes, that's exactly what I meant. I kind of borrowed the term from aviation, where it's used to describe the conditions under which the behavior of the aircraft is known and predictable. It's not that flying outside the envelope is impossible; you just don't get any guarantees.
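(For a linear closed-loop system, "inside the envelope" in the stability sense can be checked directly from the eigenvalues; a minimal sketch, with matrices and gains invented purely for illustration:)

    # Toy sketch: is x' = (A - B K) x stable, i.e. do these gains keep
    # us inside the envelope where behavior is known and predictable?
    # A, B, K are made-up numbers, not from any real aircraft or robot.
    import numpy as np

    A = np.array([[0.0, 1.0], [2.0, -1.0]])  # open-loop dynamics (unstable)
    B = np.array([[0.0], [1.0]])             # control input matrix
    K = np.array([[12.0, 4.0]])              # hypothetical feedback gains

    eigs = np.linalg.eigvals(A - B @ K)
    stable = bool(np.all(eigs.real < 0))     # stable iff all eigenvalues lie in the left half-plane
    print(eigs, "stable:", stable)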


> I took envelope to mean the operating conditions under which the control system is stable. For many control systems this isn't really a confidence interval except insofar as the parameters of the control system are imperfectly known.

Ah gotcha, totally agree.

> It has more use as a marketing term than as a technical term with precise meaning.

Yeah, exactly; it's not a problem for me that this is the case (buzzwords are just a fact of life), but what should be happening underneath any "AI" solution is finding and implementing the right solution to the problem. If the right solution is classical control theory or linear models, but business leaders insist on deep learning because to them that's what AI means, that's where there's a problem.


I think the point GP was trying to make was that the robot control model is likely a process control model, which is an area of math that's already well explored and involves no AI. Single- and multi-variable control systems[0] get into some advanced math, but they aren't obviously mappable to machine learning models; they're much more like heavy calculus.

[0] https://www.controleng.com/articles/exploring-the-basic-conc...
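(For flavor, the single-variable case is often just a textbook PID loop; a minimal sketch in generic textbook form, with illustrative gains, not tied to any particular robot:)

    # Minimal textbook PID controller: classical process control, no
    # learning anywhere. Gains kp/ki/kd would be tuned per system.
    class PID:
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measurement, dt):
            error = setpoint - measurement
            self.integral += error * dt
            derivative = (error - self.prev_error) / dt
            self.prev_error = error
            return (self.kp * error
                    + self.ki * self.integral
                    + self.kd * derivative)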


I'm confident* that "AI" these days means a DNN that takes n-dimensional input with n > 2. Anything else is "contemporary" or "conventional" math or theory.

* Read “not at all”


That's right. ML and AI are interchangeable at this point, and both mean basically what you said, though not necessarily a DNN per se; that's the general idea.

Academically, we mostly reserve AGI for things that are more advanced or speculative.



