Apple reportedly considering Perplexity acquisition

That $14B purchase price would be immediately recouped in the stock bump Wall Street would reward Apple with for making such an acquisition.

True.

In a sense, it reminds me of the NeXT acquisition: in theory Apple should be able to get AI working on their own, but, like shipping a preemptively multitasking modern OS back then, they just can't (couldn't) seem to get it done.
 
Whether it's worth $14B or not, it's petty cash for Apple, and as above, they need to be doing something for the share price after the Apple Intelligence/Siri fiasco.
 
Thinking some more...


They probably want to get onto it pretty rapidly too, because the shift from search to AI is happening and happening FAST; they're either going to be a significant player in AI or they're going to miss the boat like they did with search.

I barely use a search engine directly any more; I've pretty much defaulted to ChatGPT as a web search engine, and to deep research for anything I need to dive into (instead of me spending 2-3 hours googling stuff, ChatGPT goes down all those rabbit warrens for me, cites its sources, and filters the crap out, with a full list/chain of thought).

No ads, no 1-2 pages of sponsored crap links to sift through, etc. Just pre-vetted URLs pulled out of various repositories/search engines/pre-indexed content.

I haven't played with Gemini much yet, but apparently it is on par, maybe better.

I know some people hate AI chatbots, and at some things they leave a lot to be desired - but search... wow. It's like going back in time 25 years to when search engines were actually useful, except that the AI can even go read it for you and summarise with full citations.



Buying Perplexity would be a great move, and it's not like they're short of cash; they literally don't know what to do with it.
 

That is absolutely coming once ChatGPT and the other AI companies need to become profitable. It's reckoned that even ChatGPT's $20 tier loses money right now. Think early streaming/Uber/etc., and everyone saying how cheap they were, never mind better than their classical counterparts. Same thing. This is the introductory phase.
 
Sure, but right now I'm a pretty happy paid customer. If they enshittify the paid service I'll go elsewhere. Probably, by that point to on-device local LLM.

Right now: the choice is between massively enshittified search and something that just plain works better.

I think at this point I'd easily be willing to pay $50 for ChatGPT Plus, due to the time it saves me dealing with search that just plain sucks. Purely because of what LLMs are actually good at: reading a bunch of web content and summarising the findings, instead of me wasting time doing it.

That's something I'm pretty sure could be done on device.

 
One wonders whether, if search hadn't been enshittified in the first place, we'd be having this discussion at all, and whether the value added would be enough to justify the extra computational cost. Some of the enshittification would have happened anyway; things like SEO and gaming the rankings would always have been a constant battle. But others... these LLMs simultaneously feel like overkill but also not good enough (they still hallucinate too often, with people handing in "research" the LLMs assure them is real, complete with links). It's also not clear whether they'll get worse or better over time (an ouroboros, as the web fills up with AI-generated content that new AI is trained on).
 
Well below turned out longer than I expected but here goes….

Honestly, search being worse than AI is about more than provider enshittification.

That's part of it, but as more and more stuff that matches a query is uploaded to the internet, it's difficult to filter without some sort of understanding of the content.

New isn’t always better. Freshness is a metric used by many engines and it’s shit.

Comprehension of the body of the content is a far better metric, and LLMs that can parse content will win against dumb metrics like freshness or how good someone is at SEO.
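To make the contrast concrete, here's a minimal toy sketch (all data and function names are made up for illustration) comparing the two ranking strategies: a dumb freshness sort versus a content-match score, here stood in for by bag-of-words cosine similarity. A real engine would use learned embeddings, but the point is the same: the relevant-but-older document wins on content and loses on recency.

```python
from collections import Counter
from math import sqrt

def bow(text: str) -> Counter:
    """Bag-of-words vector: token -> count."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_by_relevance(query: str, docs: list[dict]) -> list[dict]:
    """Rank documents by how well their content matches the query."""
    q = bow(query)
    return sorted(docs, key=lambda d: cosine(q, bow(d["body"])), reverse=True)

def rank_by_freshness(docs: list[dict]) -> list[dict]:
    """Rank documents by publication date alone, newest first."""
    return sorted(docs, key=lambda d: d["published"], reverse=True)

docs = [
    {"title": "Deep dive: Azure VM pricing tiers",
     "body": "azure virtual machine pricing tiers reserved instances explained",
     "published": "2023-01-10"},
    {"title": "Ten gadgets this week",
     "body": "new gadgets roundup phones watches",
     "published": "2025-06-01"},
]

query = "azure virtual machine pricing"
print(rank_by_relevance(query, docs)[0]["title"])  # the on-topic doc, despite its age
print(rank_by_freshness(docs)[0]["title"])         # the newer, off-topic roundup
```

Freshness alone surfaces the gadget roundup; content comprehension surfaces the pricing deep dive.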


I can do natural language queries, have the AI go off and find stuff, interpret the results, and summarise in a usable form.

Like it or not that is a massive win for Gemini, ChatGPT and the like. They will cite sources that you can validate against.

I held off using them due to the hype and lack of confidence in what they can do but they’re like any other tool: hammers are good at driving nails but don’t work as screwdrivers and vice versa.

If you aren’t keeping an eye on the progress in this space at this point you’re really going to be left behind.

Avoiding hallucination is a thing, and there are ways: choosing an appropriate model, prompting it with content to index, etc.

Yes. Treating output as fact, without any access to data specific to the field of the query, is a wrong-headed way to use them.

Reasoning models aren't necessarily better if you want good results; it depends what you're after: creativity vs. indexes into real content.
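The "prompt it with content to index" idea above is the part you can sketch without any model at all: assemble a prompt that pastes the retrieved passages in, numbers them as citations, and instructs the model to answer only from those sources or say it can't. This is a minimal illustration; the function name, instruction wording, and URL are all made up, and the actual model call is deliberately omitted.

```python
def grounded_prompt(question: str, sources: dict[str, str]) -> str:
    """Build a prompt that restricts the model to the supplied sources.

    `sources` maps a citation key (e.g. a URL) to the passage text.
    The instructions tell the model to cite source numbers and to
    admit when the sources don't contain the answer, rather than
    inventing one -- the basic anti-hallucination move.
    """
    numbered = "\n\n".join(
        f"[{i}] {key}\n{text}"
        for i, (key, text) in enumerate(sources.items(), 1)
    )
    return (
        "Answer the question using ONLY the sources below. "
        "Cite sources by number, e.g. [1]. If the sources do not "
        "contain the answer, reply exactly: NOT IN SOURCES.\n\n"
        f"Sources:\n{numbered}\n\n"
        f"Question: {question}"
    )

prompt = grounded_prompt(
    "Which VM series is cheapest?",
    {"https://example.com/azure-pricing":
     "B-series burstable VMs are the cheapest general-purpose option."},
)
print(prompt)
```

The returned string would then be sent to whatever model you use; the grounding lives entirely in the prompt construction.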

I think that Apple needs a boost in this space; by the same token, I think their approach of small models processing specific data on device (not reasoning too much and not leaning too much on creativity; more assisting with actual data you already have, e.g. the contents of mail, calendar, documents, etc.) is going to be a huge thing in the near future.

It's what I've wanted from Siri for years: not a dumb voice processor and "I can show you web results on your iPhone".

More an AI assistant to monitor data being sent to me (email, calls, files, etc.) or generated by me, pull out relevant details, and help keep me on top of my responsibilities processing said inputs.

Things like “hey, bob sent you an email about a project last week that needs to be completed by the end of the month, it sounds important, so I’ve blocked out an hour of your time tomorrow to review it”.

Or

“While you were in work focus, you had 15 missed calls, 3 of which were important. I’ve scheduled time for you to follow up at 2pm”

Basically, I see on-device AI as being a personal assistant for people who aren't C-suite.
 
Just as an example of how it helped me today:

Azure virtual server pricing.

I’ll link it here.

This info would have taken me hours to figure out via the unmitigated disaster that is Microsoft Azure costing.

Took seconds, and it's accurate (I checked).


[Attached screenshots: IMG_2648.png and IMG_2649.png]
It carries on, citing links etc. to the calculators. But in like 30 seconds it had usable info for me, formatted in a table with various options.
 