Isaac
Lifelong Learner
- Local time: Yesterday, 17:40
- Joined: Mar 14, 2017
- Messages: 10,109
And we are back to the question: what is the basis of the value of its output? Mostly, I don't see that it comes up with new things. It offers nothing in the way of real creativity. Everything it produces consists of stuff that is already available in some form elsewhere.
What it does seem to have down is combining that old stuff into novel strings (or pixels, or beeps). That's the essence of language and, by extension, coding - a performative utterance of sorts. So it's good at coding and at putting words together. Yet its record on producing new information is spotty as hell: specifically, it likes to make up whatever sounds good, whether it's true or not.
And that makes me hesitant to trust it to provide ROI-positive decorating advice. Can it suggest a color for the wall? Maybe. But does it have anything backing up that color choice, or did 'rainforest tangerine' just make a nice-sounding string of words?
How can we know whether a big chunk of its current value rests on whether people recognize, and therefore affirm and trust, the output? It is definitely very good at producing responses that make you think, 'Gee, that sounds perfect and well-rounded.'
Now I'm starting to put a little effort into recalling the average difference, if any, between what I usually find when I research a topic and what it outputs when I ask it a question - and precisely what that difference may or may not mean.
These are all important questions to keep asking as artificial intelligence or anything that has that label continues to expand in usage.
As with other subjects, if asking the question gets a person into a lot of trouble, that tells you something. And what it tells you does not bode well for the outcome, with the exception of temporary revenue.