One Line in /etc/hosts Held My Chrome Hostage for Two Years

[Image: Chrome hosts debugging]

Something very "AI era" happened today.

My Chrome had been broken for two years.

The symptom was bizarre: type a keyword in the address bar and Google Search would spin forever. Later it started flatly reporting "This site can't be reached." But typing a URL directly? That worked fine.

For two years, I did every standard thing an IT person would do:

Reinstalled Chrome. Upgraded Chrome. Deleted the profile. Checked extensions. Checked DNS. Checked proxy settings. Checked the search-engine config. Even suspected Google itself was glitching.

Nothing helped.

Then today, I asked my Hermes agent Tuya to look into it.

Tuya didn't stop at the FAQ-level "try reinstalling." It started digging like a battle-hardened sysadmin, layer by layer:

Chrome's configuration. Its SQLite databases. The Preferences file. The system layer. The hosts file.

And finally unearthed this:

A two-year-old zombie config sitting in my /etc/hosts:

    31.13.72.23 www.google.com

That IP? It belongs to Facebook.

Which means:

For two whole years, every time I typed a search query in Chrome's address bar, I was essentially saying:

"Take my Google request and hand it to Facebook."

Facebook, of course, was baffled: "Who the hell are you?" Its servers can't present a valid certificate for www.google.com, so the request could never succeed.

And timed out.
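If you want to sweep your own machine for this kind of landmine, the check fits in a few lines of Python. This is a minimal sketch, not what Tuya actually ran: it lists every hostname in /etc/hosts that is pinned to a non-loopback IP, then asks the system resolver what it currently returns for that name. The allow-list of benign IPs matches a stock macOS hosts file and is my assumption.

    #!/usr/bin/env python3
    """Audit /etc/hosts for hostnames pinned to hard-coded IPs."""
    import socket

    HOSTS_PATH = "/etc/hosts"  # standard location on macOS and Linux
    # Entries found in a stock hosts file (loopback/broadcast); anything
    # else is a manual override worth a second look.
    BENIGN_IPS = {"127.0.0.1", "255.255.255.255", "::1", "fe80::1%lo0"}

    with open(HOSTS_PATH) as f:
        for raw in f:
            line = raw.split("#", 1)[0].strip()  # drop comments and blanks
            if not line:
                continue
            ip, *names = line.split()
            if ip in BENIGN_IPS:
                continue
            for name in names:
                # This name never reaches DNS; it is pinned to `ip`.
                print(f"OVERRIDE: {name} -> {ip}")
                try:
                    print(f"  resolver now returns: {socket.gethostbyname(name)}")
                except OSError as exc:
                    print(f"  resolver error: {exc}")

Run on the machine in this story, it would have printed OVERRIDE: www.google.com -> 31.13.72.23 on day one.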

The truly absurd part?

Updating Chrome could never fix this, because /etc/hosts is a macOS system file that lives below the browser. Chrome never touches it; it just asks the system resolver and trusts whatever answer comes back.

It's like:

Someone secretly changed your house number to your neighbor's address, and you kept ordering furniture that could never find its way home.

But here's the deeper thing:

The scariest part of this kind of problem isn't complexity.

It's that you'd never think to look there.

Normal people check the browser. Check extensions. Check the network. Check DNS.

Who would think: "Chrome won't search" has anything to do with a Facebook IP hidden in /etc/hosts?

A lot of real-world problems work exactly like this.

What really tortures you isn't the "major outage."

It's some tiny config someone left behind two years ago. A patch nobody remembers. A "temporary fix." A rule nobody reads anymore.

It lies there quietly, like a corpse.

Until one day, the whole system starts slowly poisoning itself.

And everyone keeps debugging on the wrong layer.

This is actually what makes AI agents interesting.

They're not necessarily smarter than humans.

But sometimes they're less biased.

Human experience can be so strong it becomes a cage.

"Chrome broken" → must be Chrome. "Network issue" → must check DNS. "Search not working" → must reinstall the browser.

But an agent doesn't care about saving face. Doesn't care about industry common sense.

It just digs down, layer by layer.

And sometimes, it digs up a corpse.

Two-year zombie config. Laid to rest today.
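For the record, the burial was the easy part: open /etc/hosts with sudo, delete the offending line, then flush the resolver cache so the change takes effect immediately. On macOS that means running sudo dscacheutil -flushcache followed by sudo killall -HUP mDNSResponder.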
