ChatGPT became a search engine.

Denis Tsyplakov
2 min read · Nov 1, 2024

--

OK, let’s see how it works with info from the Internet.

Wow! It works: it can search for the latest data and even provide a link to the source.

Let’s do something more advanced. Can ChatGPT use data from the Internet?

Wow again! It just works.

And what about a real use case? Specifically, coding tasks.

Below is a sample of Angular 12 code with *ngIf. Rewrite it with the new Angular 17 if syntax.


```html
<div *ngIf="isLoggedIn; else loggedOut">
  <p>Welcome back, user!</p>
  <button (click)="toggleLogin()">Logout</button>
</div>
<ng-template #loggedOut>
  <p>Please log in to continue.</p>
  <button (click)="toggleLogin()">Login</button>
</ng-template>
```
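For context: the snippets in this post assume a component exposing an `isLoggedIn` flag and a `toggleLogin()` method. A minimal sketch could look like this (the component name, selector, and file name are my own placeholders, not part of the original prompt):

```typescript
// Minimal component behind the templates in this post. Only isLoggedIn and
// toggleLogin() come from the original snippet; everything else is a placeholder.
import { Component } from '@angular/core';

@Component({
  selector: 'app-login-toggle',
  templateUrl: './login-toggle.component.html',
})
export class LoginToggleComponent {
  isLoggedIn = false;

  toggleLogin(): void {
    this.isLoggedIn = !this.isLoggedIn;
  }
}
```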
Answer from GPT-4o:

```html
<div *if="isLoggedIn; then loggedIn; else loggedOut"></div>

<ng-template #loggedIn>
  <p>Welcome back, user!</p>
  <button (click)="toggleLogin()">Logout</button>
</ng-template>

<ng-template #loggedOut>
  <p>Please log in to continue.</p>
  <button (click)="toggleLogin()">Login</button>
</ng-template>
```

Let’s use o1-preview. It thinks for 10 seconds, digesting Angular 17 syntax…

Hm! Good sign!

Aaaaand!

```html
<if [condition]="isLoggedIn">
  <then>
    <p>Welcome back, user!</p>
    <button (click)="toggleLogin()">Logout</button>
  </then>
  <else>
    <p>Please log in to continue.</p>
    <button (click)="toggleLogin()">Login</button>
  </else>
</if>
```

Well… Just in case, the real answer should be something like this:

```html
@if (isLoggedIn) {
  <p>Welcome back, user!</p>
  <button (click)="toggleLogin()">Logout</button>
} @else {
  <p>Please log in to continue.</p>
  <button (click)="toggleLogin()">Login</button>
}
```
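A nice property of the new block syntax: it is built into the Angular 17 template compiler, so no NgIf import or CommonModule is required. A rough standalone-component sketch showing it in context (names are placeholders, as above):

```typescript
// Sketch of an Angular 17 standalone component using the built-in @if/@else
// control flow. Component and field names are illustrative assumptions.
import { Component } from '@angular/core';

@Component({
  selector: 'app-login-toggle',
  standalone: true,
  template: `
    @if (isLoggedIn) {
      <p>Welcome back, user!</p>
      <button (click)="toggleLogin()">Logout</button>
    } @else {
      <p>Please log in to continue.</p>
      <button (click)="toggleLogin()">Login</button>
    }
  `,
})
export class LoginToggleComponent {
  isLoggedIn = false;

  toggleLogin(): void {
    this.isLoggedIn = !this.isLoggedIn;
  }
}
```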

Let’s try to give it a hint.

And it works, finally.

But without an explicit prompt, the result was incorrect. For me this means that unless you are doing something simple, you cannot trust GPT search results if you cannot verify them afterwards.

Be careful.

PS: BTW, the ability to do search technically means that an LLM can “escape”: it can hit a web URL -> can make a REST call -> can trigger remote code execution and other fun stuff. One step closer. From the LLM capability standpoint, one step is left: the current LLM cannot make calls on its own; the action has to be initiated by the user. But once we have “AI agents that can do background tasks for us,” we may face some interesting consequences.

