I saw an article fretting about taxpayer-funded broadband projects in Texas Monitor. It cites a “study” by the Taxpayer Protection Alliance Foundation that purports to show a wide swath of “failed taxpayer-funded networks”.
A little research on the site led me to realize that it’s not first-rate work – it relies on outdated, incorrect information – so I left the following comment on the Texas Monitor site:
I decided to check the “Broadband Boondoggles” site to see what information they provide. First off, the copyright date on the site’s footer says 2017 – are they even updating it?
More specifically, I found that they disparage the local ECFiber.net project (in VT) of which I have personal knowledge. They state that as of January 2015 ECFiber has spent $9M to connect 1,200 subscribers (“an astounding $7,500 per customer.”)
Well, that may be true – as of that date. If they had bothered to follow up with ECFiber’s progress (https://www.ecfiber.net/history/) they would have learned:
- As of January 2018 they have connected over 2000 customers (cost per subscriber is now roughly half that reported number)
- They’re hampered by the pole “make ready” process by the incumbent monopoly carriers who are slow to respond. They could connect subscribers faster if the carriers would follow their legal make-ready obligations.
- ECFiber is a private community effort, entirely funded with grants and private equity/loans, so I’m curious how they could even have filed a FOIA request.
- They’ve now raised $23M capital (from the private markets), to reach 20,000 subscribers.
- This gives a system-wide average cost of $1,150/subscriber – a very attractive cost.
I’m sure there are false starts and overruns for many municipal projects, but if this outdated information is typical of the remainder of the TPAF site, then I would be reluctant to accept any of its conclusions without doing my own research.
I’ll be speaking next month at the WordPress Meetup about using Docker to host a development WP server on your laptop. Here’s the writeup:
Docker for WordPress
Docker enables developers to easily pack, ship, and run any application (including WordPress) as a lightweight, self-sufficient container which can run virtually anywhere.
For WordPress users, this means it’s easy to set up a lightweight development WP server on your laptop/desktop. You can make and test changes before migrating them to your client’s site. Best of all, if you screw things up, you can simply discard the container’s files and start afresh in a couple of minutes. And because it’s running on your local computer, there’s no need to worry about hosting, configuring servers, etc.
Rich will show how to install the Docker application on a laptop, then install and start a WordPress Docker container. The result is the familiar new WP install that you can customize to your heart’s (or client’s) content.
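As a preview, the demo boils down to a couple of commands. This is a minimal sketch using the official `wordpress` and `mysql` images from Docker Hub – the container names, port, and password shown here are placeholders you’d choose yourself:

```shell
# Create a network so the two containers can talk to each other
docker network create wp-net

# Start a MySQL container for the WordPress database
# (MYSQL_ROOT_PASSWORD is a placeholder - pick your own)
docker run -d --name wp-db --network wp-net \
  -e MYSQL_ROOT_PASSWORD=changeme \
  -e MYSQL_DATABASE=wordpress \
  mysql:5.7

# Start the official WordPress container, pointed at that database
docker run -d --name wp-site --network wp-net \
  -e WORDPRESS_DB_HOST=wp-db \
  -e WORDPRESS_DB_USER=root \
  -e WORDPRESS_DB_PASSWORD=changeme \
  -p 8080:80 \
  wordpress

# Browse to http://localhost:8080 for the familiar WP install screen
```

When you’re done (or have broken something), `docker rm -f wp-site wp-db` throws it all away so you can start over.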
The WordPress Meetup is open to all on Tuesday, 8 May. Sign up at https://www.meetup.com/WordPressDevNH/events/249032144/
I went to a terrific talk at the Lyme Library earlier this week.
Randall Mikkelsen from Reuters spoke on the topic, “Fake News: What’s the Real Story?”. In it, he presented The Chart, which is an analysis of popular web sites showing their bias (left, center, right) along with a measure of their reliability/believability. It’s useful to check your reading habits to see if they match your expectations.
That site also has Six Flags to Identify a Conspiracy Theory Article. This is an easy way to check your reading matter to see if it’s “actual news” or just somebody writing to get you fired up. (I also included a comment – what do you think?)
So you’ve just learned something cool on a new subject, and you want to let the world know about your discovery. You go to the project’s wiki, and jot it all down. But how can you help people read what you’ve written?
When I look at pages on a wiki, I use three criteria to determine whether I want to spend the time to read a page. If I’m convinced that the page has the info I’m seeking, I’ll work hard to understand it. But if I can’t tell whether it’s any good, it’s just faster to post a query to the forum. Here are the questions I ask:
- Is this page for me? Does it apply to my situation?
There are a lot of cues to whether a page “is for me”. Obviously the title/heading of the page is important. But when I’m seeking information, I’m not usually an expert in the subject. I need help to understand the topic, and I look for a description that tells what the page is about. I also look for cues to see if it’s up to date. Finally, I love a page that has an introductory section that talks about the kinds of info that I’ll find on the page.
- Does the author know more than I do?
A number of factors influence this judgment. As you’re aware, wiki page authors span a huge range of knowledge levels – from the expert to the newcomer who’s excited to document his first discovery. As I scan through a page, I look for facts that confirm what I already know (proving the author has some skill), and then things that I don’t (showing they know more). Finally, it helps to see that the author follows the conventions of the wiki – does the page look like other wiki pages? If so, I get some comfort that they understand how the wiki works.
- Can I figure out what to do?
My final question about whether a page is useful is whether I can actually use the information. If it’s a tutorial/howto, I want the steps clearly stated – “step 1, step 2, step 3, then you’re done.” If it’s a reference page, is the information organized in a comprehensible fashion? Is it really long? Can I pick out what’s important from the incidental info?
The challenge I put to every author is to organize the information so that the most frequently-sought info comes first, then figure out what to do with the rest. You might move sections around, move some information onto its own separate page, coalesce it into an existing/similar wiki page, or even create forum articles (instead of a wiki page) if the subject is rapidly evolving.
I just sent an email to the reporter from NewsPressNow who posted a typical net neutrality story. A flaw in this kind of reporting is the tacit acceptance of an ISP’s blandishments that the Internet was fine before the 2015 FCC rules, and that “… And I don’t know if you’d find anyone who said there was a problem with the internet.”
Well, someone said there was a problem, because Comcast paid a $16 million fine to settle a lawsuit for blocking/throttling legal internet traffic, exactly the kind of behavior that would be permitted by the change of rules. As I said in my note to the reporter:
I don’t know whether he [the source at the ISP] is ignorant of history, or simply baldly saying things that are known to be false, but a quick google of “Comcast throttle bittorrent” will turn up copious evidence that some ISPs were throttling the internet in those “good old days”. See, for example, this article that offers technical details of the Comcast case:
Wired: https://www.wired.com/2007/11/comcast-sued-ov/
This behavior by Comcast is the best documented, but I believe more research would turn up other ISPs who dabbled in various kinds of throttling behaviors before the Title II language went into effect.
I encouraged the reporter to update the story with a reaction to this information from his source at the ISP.
Update – November 2017: Added descriptions for the other tools I had investigated.
Now that the LEDE Project has an official release, I hungered for a way to see what kinds of traffic are going through my network. I wanted to answer the question, “Who’s hogging the bandwidth?” To do that, I needed a Netflow Collector.
A Netflow Collector is a program that collects flow records from routers to show the kinds and volumes of traffic that passed through the router. The collector adds those flow records into its internal database, and lets you search/display the data. (You also need to configure your router to send (“export”) flow records to the collector. My experiments all employ the softflowd Netflow Exporter. It is a standard package you can install into your LEDE router.)
In an earlier life, I used a slick commercial Netflow monitoring program. But it wasn’t free, so it isn’t something that I can recommend to people for their home networks.
There are many open-source Netflow collectors with varying degrees of ease of installation, ease of use, and features. Most have install scripts that show the steps required to install them on an Ubuntu or CentOS machine, but they are fussy, and require a freestanding computer (or VM) to run.
Consequently, I created Docker containers that have all the essential packages/modules pre-configured. This means that you can simply install the Docker container, then launch it on a computer that’s continually operating, and let it monitor the data.
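Launching one of these containers looks roughly like this – the image name below is illustrative (substitute the actual container from each post in the series), and the ports are the conventional choices: UDP 2055 for incoming Netflow records, and a local port mapped to the container’s web GUI:

```shell
# Launch a Netflow collector container on an always-on machine.
# "example/netflow-collector" is a stand-in for the real image name.
docker run -d --name netflow-collector \
  -p 2055:2055/udp \
  -p 8080:80 \
  example/netflow-collector

# Then point your router's Netflow exporter at this machine's
# address, port 2055, and browse to http://localhost:8080
```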
This is the first of a series of postings about Netflow Collectors. They include:
- Webview Netflow Reporter Netflow collector and web-based display program. Makes it easy to see fine-grained information about traffic. More…
- NFSEN/NFDUMP Netflow collector and web-based display program. Provides attractive graphs, and automatically detects Netflow exporters (so you can skip one configuration step.) More…
- FlowViewer Another Netflow Collector with web-based GUI. I created a Docker Container for FlowViewer
- DDWarden This claims to work with DD-WRT’s rflow protocol (very similar to Netflow v5). No further investigation because I was interested in something to work with LEDE/OpenWrt.
- Generating Netflow Datagrams A few ways to generate Netflow data:
softflowd to run on LEDE/OpenWrt routers and nflow-generator to send mock data in the absence of real traffic.
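For reference, getting softflowd exporting from a LEDE/OpenWrt router is a short exercise. This is a sketch using the standard softflowd package’s UCI configuration; `192.168.1.2` is an example address – use the machine running your collector:

```shell
# On the LEDE/OpenWrt router: install the softflowd package
opkg update
opkg install softflowd

# Point the exporter at the collector's address and Netflow port,
# watching the LAN bridge interface
uci set softflowd.@softflowd[0].interface='br-lan'
uci set softflowd.@softflowd[0].host_port='192.168.1.2:2055'
uci set softflowd.@softflowd[0].enabled='1'
uci commit softflowd
/etc/init.d/softflowd restart
```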
The Battle for the Net site https://www.battleforthenet.com/ no longer seems to have the telephone form(!)
But… Boing Boing does. Go to https://boingboing.net/. You’ll see a popup window with a place to enter your phone number. Click OK, and they pop up a script on-screen.
They call you, you answer, then you supply your zip code.
Then they place calls to each of your legislators (in the House and Senate); if you have time, they’ll also call the offices of Mitch McConnell, Chuck Schumer, and other leaders, so you can deliver the message.
I say my name, home town, and then ask that the FCC preserve the current Title II Net Neutrality rules. The staffer who answers is gonna be busy – you might chat them up though to see if they’re getting slammed. (Mitch McConnell’s office wasn’t even answering(!))
Although I usually agree with him, one of my favorite bloggers, Dave Winer, recently said this:
One of the ideas circulating is that your ISP has a monopoly, owns the only way for you to get to the Internet, but that’s an old idea, it’s no longer true. Where I live the wireless vendors are just as fast as the wired ISP. The cost is still prohibitive, I still need wifi, but given an economic incentive to replace Comcast and Spectrum et al, some wireless vendor is going to step in, probably the smaller ones who aren’t yet owned by one of the big ISPs. Google could buy Sprint for example, and provide a route-around.
I wish I had the same competitive landscape that Dave enjoys. I wish this were true for the rest of the country. But the FCC’s own report from June 2016 (see page 8) shows that 58% of the country’s census blocks have 0 or 1 provider of 25/3 Mbps internet service. This seems a lot like a monopoly.
Let me tell you about the facts on the ground in my town of 1700 people in rural New Hampshire. My conversations with others in the region indicate these conditions hold in huge numbers of communities throughout much of New Hampshire, Vermont, and Maine.
- The best internet service in town is from Fairpoint. It’s possible to get DSL service to any home, but it’s still just DSL (and often very slow): they’re the only game in town.
- There is a wireless ISP, but the hilly terrain means their service is OK (10/1 Mbps) if you can get it, but only selected areas can be served.
- What about cable? Comcast finagled their claim to serve the entire zip code by providing service to one cluster of homes on the southern town border. They refuse to provide cable/internet service to the town center, let alone any place a mile away from there.
- And cell service? There’s only one bar in the center of town. You can’t make a phone call, so you sure couldn’t use the cell service for data.
So our incumbent ISP (Fairpoint) has a de facto monopoly position, with no alternatives in sight.
I wish that we could rely on the entrepreneurial impetus to sweep away bad, monopolistic ISPs. But we can’t – at least not in any reasonable time frame. The incumbents have rigged the system. NH law (instituted at the behest of the incumbent providers) prevents towns and cities from bonding to create their own municipal networks.
Back to the initial point: The FCC is making rules that seem to assume that we can “just switch carriers” if we don’t like their offering. Yet they fail to provide evidence that any such competitive service exists.
I say, leave the Net Neutrality rules alone until there’s a far better competitive landscape that would allow me to shop around for an ISP that provides options I might care for.
Hat tip to Ro Khanna (@RoKhanna on Twitter) for this…
A Portugal ISP (with no net neutrality constraints) appears to be charging 4.99€ (about US$5.86) per month for access to social media. And another 4.99€ for streaming video (Youtube, Netflix, etc). Oh, and another 4.99€ for streaming music. And additional charges for other kinds of network traffic. Here’s a link to their web page, which I ran through Google Translate to make it easier to read.
The FCC has proposed to end the rules that prevent ISPs from slicing and dicing your access to the entire internet.
The FCC rules (released this week) are scheduled to be voted into effect on 14 Dec 2017.
This will be really bad for consumers. But it’ll be worse for entrepreneurs who’re not big companies (yet), and could easily be left “below any horizon”, and simply not visible to general customers.
What can I do?
John Oliver’s TV shows generated over 22 million comments on the FCC site, but they chose to disregard the public’s sentiment.
However, the Congress can tell the FCC not to issue these rules. But they need to know that people really care. The easiest way to make your voice heard is to call Congress directly. It sounds like a hassle, but it really isn’t…
The folks at Battle for the Net make it super easy. Give them your phone number; they dial up your congressperson’s office, then ring your phone. They even give you a script to read to the staffer (who’ll probably answer the phone) so you can tell them what you’re thinking. A 30-second call is enough to let them know your thoughts.
DRY – Don’t Repeat Yourself – is it relevant for documentation? I recently saw this comment on a forum…
I’m not sure how useful it is to remove duplication [from the documentation pages]. It’s not code…
IMHO, duplication in documentation is a couple orders of magnitude worse than duplication in code (and duplication in code is bad) because bad documentation has the power to waste more people’s time.
With code, a single (knowledgeable) developer must take the time to read through the duplicated code to look for subtle differences.
But with documentation, every reader – perhaps hundreds of far less knowledgeable people – must mentally diff the two pages looking for common threads/important items/gotcha’s to try to be sure that they will succeed.
For example, these two documentation pages describe [some procedure…], and each describes a substantially different procedure. I often find the differing explanations so difficult to reconcile that I simply give up (or maybe resolve to come back some day), rather than risk bricking my router/leaving it inoperable/etc.
So, for common tasks, I believe it is always better to have a single well-curated page that correctly and concisely describes the procedure, instead of having multiple people write their own incomplete or marginally correct procedures.