While I don't disagree with your concerns about our freedoms, the Internet is not doing OK. It was based on technology of the 60s and implemented with hardware of the 70s. In those days computer memory, processor speed, and communication speeds were all primitive: everything was slow, and memory was hard to get in any decent size. 8K of memory was built on toroidal magnetic cores, with actual wires threaded through the memory cells. It stood as tall as a fridge and as wide as a fridge door.
Even the first IBM personal computer was more powerful than an IBM 360-50. DOS could address only a small amount of internal memory, and the disk drive held 10 MB. Modem speeds were very slow, well under 128 kbps.
Compare that with today: GHz processors, TB disks, and internal memory measured in GB. Cable internet download speeds range from about 3 to 50 Mbps (megabits per second), roughly 50 to 900 times faster than a 56k dial-up connection, and some cable providers now offer speeds of up to 150 Mbps.
My point is that the TCP/IP roads were built for those old computers and communication speeds, yet those same roads are still carrying the computer systems and speeds of today.
Internet security has to be designed in, not patched decade after decade. It is like how DOS was the backdoor into every MS Windows OS: backward compatibility allowed clever attackers to hack both the computer and the TCP/IP communication roadways.
The encryption we have today would never have been breakable in the 70s, but now we have to keep adding bits, pushing keys to 256 and beyond, to stay secure. With distributed CPU cores, brute-forcing shorter keys is just a matter of adding more and more CPUs.
So the real issue should be: why aren't we building a new web that is designed around privacy? Many of the hacks over the years happened because memory was small and computers were slow, so very few checks were made in computer software and firmware. Telling the software to fill a table that held only 256 locations with something much bigger would corrupt the computer, because any location beyond 256 belonged to some other function or program.
The NSA is capturing data on the Internet by installing nodes that copy the data and pass it on. There are no real checks on the Internet.