Tarus Balog : 2022 Open Source Summit – Day 4

June 25, 2022 07:04 PM

I always feel a little sad on the last day of any conference, and Open Source Summit was no different. It seems like the week went by too fast.

With the Sponsor Showcase closing on Thursday, attendance at the Friday keynotes was light, but those of us who showed up got to hear some pretty cool presentations.

Picture of Rachel Rose on stage

The first one was from Rachel Rose, who supervises R&D at Industrial Light and Magic. As a fanboy of ILM I was very eager to hear what she had to say, and she didn’t disappoint. (sorry about the unflattering picture but I took three and they were all bad)

In the past, special effects that combine computer-generated imagery (CGI) and live action were created separately: the actors performed in front of a green screen and the CGI backgrounds were added later. Technology has advanced to the point that the cutting edge now involves live-action sets surrounded by enormous, curved LED screens, with the backgrounds projected as the actors perform.

This presents a number of challenges as the backgrounds may need to change as the camera moves, but it provides a much better experience for the actors and the audience.

The tie-in to open source is that a lot of the libraries used in the creation of these effects are now open. In fact, the Academy of Motion Picture Arts and Sciences (the people responsible for the Oscars) along with the Linux Foundation have sponsored the Academy Software Foundation (ASWF) to act as a steward for the “content creation industry’s open source software base”. The projects under the ASWF fall into one of two tiers: Adopted and Incubation. Currently there are four projects that are mature enough to be adopted and several more in the incubation stage.

A lot of this was so specific to the industry that it went over my head, but I could understand the OpenEXR project, which provides a reference implementation of the EXR file format for storing high quality images.

A slide showing the ILM Stagecraft volume setup

She then went on to talk about Stagecraft, which is the name of the ILM platform for producing content. I would love to be able to visit one day. It would be so cool to see a feature being made with the CGI, sets and actors all integrated.

Picture of Vini Jaiswal on stage

The next speaker was Vini Jaiswal, Developer Advocate for Databricks. I had seen a cool Databricks presentation back on Day 2 and the first part was similar, but Jaiswal skipped the in-depth technical details and focused more on features and adoption. A rather large number of companies are using the Delta Lake technology as a way to apply business intelligence to data lakes, and as the need to analyze normally unstructured data becomes more important, I expect to see even more organizations adopt it.

The third presentation was a video by Dmitry Vinnik of Meta on measuring open source project health.

Begin rant.

To be honest I was a little unhappy to see a video as a keynote. It was the only one for the entire week, and I have to admit I kind of tuned it out. It wasn’t even novel, as he has given it at least twice before. The video we were shown is available on YouTube from a conference earlier in the month, and he posted another one dated June 24th from the Python Web Conference (while it has a different splash screen, it looks to be the same presentation).

A still picture of a part of the video sent in by Dmitry Vinnik

Look, I’ve given the same talk multiple times at different conferences, so I get it. But to me keynotes are special and should be unique. I felt insulted: I bothered to show up in person, wear a mask, and get my temperature checked each day, and I expected something better than a video I could have watched at home.

Note: Rachel Rose played a video as part of her presentation and that’s totally cool, as she didn’t “phone in” the rest of it.

Okay, end rant.

The next two presenters were very inspiring young people, and it was nice to have them included as part of the program.

Picture of Alena Analeigh on stage

The first speaker was Alena Analeigh, an amazing young woman who, among other achievements, has been accepted to medical school at age 13 (note that in trying to find a reference for that I came up blank, except for her Twitter bio, so if you have one please let me know and I can update this post).

Med school is just one of her achievements. She also founded The Brown STEM Girls as an organization to get more women of color interested in science, technology, engineering and math. She stated that while men make up 52% of the workforce, they represent 76% of people employed in STEM fields.

My love of such things was fostered at an early age, and programs like hers are a great step to encourage young women of color to get interested in and eventually pursue careers in STEM.

While she seemed a little nervous and tentative while presenting, the final speaker of the morning was the exact opposite. Orion Jean is only 11 years old, but I could listen to him speak for hours.

Picture of Orion Jean on stage

Orion also has a number of accolades, including Time Magazine’s “Kid of the Year“. He got his start as the winner of a speech contest sponsored by Think Kindness, and since then has started the Race to Kindness (“a race where everybody wins”) to spread kindness around the world.

To help inspire acts of kindness he uses the acronym K.I.N.D.:

  • Keep Your Eyes Open: Look for opportunities to be kind to others. One example he used is one I actually practice: if you are in line to check out at the store and you see a person with far fewer items than you, why not offer to let them check out first?
  • Include Others: No one can effect change alone. Get others involved.
  • Nothing Is Too Small: One thing that keeps us from spreading kindness is that we try to think too big. Even small acts of kindness can have a huge impact.
  • Do Something About It: Take action. Nothing can change if we do nothing.

After the keynotes I had to focus on some work stuff that I had let languish for the week, so I didn’t make it to any of the presentations, but overall I was happy with my first conference in three years.

A few attendees tested positive for COVID, so I plan to take some precautions when I get home, and I hope that the steps the Linux Foundation took to mitigate infection worked. So far I’ve tested negative twice, and I’ll probably take another test on Monday.

My next conference will be SCaLE in Los Angeles at the end of July, and I plan to be in Dublin, Ireland for Open Source Summit – Europe. If you are comfortable getting out and about I hope to see you there.

Mark Turner : Who is paying to coddle the racists?

June 25, 2022 05:18 PM

Facebook’s algorithms seem to have pegged me as a conservative, which I find amusing but also useful. I get to view ads that have absolutely no relevance to me yet prove to be an insightful look at what kind of red meat right-wing organizations are feeding their gullible followers. Yesterday I saw a provocative Facebook ad that was made to rile up the fearful. The group “Color Us United” is holding a talk next week entitled “How Wake County Is Turning Into Woke County.”

This got me looking into the organization. Color Us United appears to be a Morrisville-based non-profit run by Kenneth Xu, a 24-year-old who, it seems, makes a living stirring up racial animosity under the guise of condemning it. He’s been profiled on a few North Carolina right-wing blogs as well as Hill.TV, and his narrative seems to be that racists should not be called out on their racism.

Take a look at the catalog of ads Color Us United is currently running or has run on Facebook:

I was interested in knowing more about this organization, so I figured I’d start first by looking at its corporate filings. Color Us United calls itself a 501c(3), so it should have some paperwork somewhere. Let’s do a search on the North Carolina Secretary of State’s website:

Oof. Nothing here.

OK, well then it’s a charity, right? Let’s look to see if it has a charitable solicitation license on the NC Secretary of State site:

Hmm. No charitable solicitation license, either!

Interesting, huh? The Secretary of State does explain that not all charities are required to obtain a charitable solicitation license. Notably, charities that bring in less than $25,000 a year and don’t compensate anyone are exempt from the license:

Color Us United does solicit donations on its website, where it tells potential donors that its EIN is 85-0513810.

EINs are a handy way to uncover more documents about an organization. A Google Search on 85-0513810 gives us:

So, Color Us United does not appear to be the organization that was registered with the 85-0513810 EIN. Instead, the EIN appears to belong to a nonprofit called “Center for Race and Opportunity,” which seems to have two addresses associated with it: one at PO Box 314, Kingston, NJ; and one at 3781 Westerre Pkwy Suite F, Henrico, VA. The Henrico address appears to belong to a coworking space, according to a check of Google StreetView.

It’s possible that Color Us United has a Doing Business As (DBA) legal arrangement but I have not found anything yet to indicate this.

The Center for Race and Opportunity is/was a Virginia nonprofit that Mr. Xu apparently created, according to its entry in the opencorporates website.

.. and the IRS:

It appears from entries on the opencorporates and the Virginia-Company website that the Center for Race and Opportunity is in danger of being inactivated, possibly due to some late filings. The only Form 990-N entry shown on the IRS website is one filed for the 2020 tax year.

So what does this all mean? It seems at first glance that the Center for Race and Opportunity and the Color Us United organizations are small potatoes (if they are, indeed, individual organizations – it is hard to tell). It would be tempting to dismiss this as the hobby of a privileged but misguided kid but one thing really stands out here: the Color Us United group has spent over $100,000 on political Facebook ads over the last four years.

Here’s one of the many ads Color Us United ran, this one attacking the Salvation Army and reaching an audience greater than 1 million people and costing $4,000 – $5,000. That five grand could have made for a nice Christmas for some needy families but instead it went to defending racists. Lovely.

A hundred grand ain’t small potatoes, especially to your typical 24-year-old kid. So where is this money coming from? It’s really hard to say, as there are few resources available online to track it. I have been unable to find Color Us United listed in any of the databases of the North Carolina Secretary of State. The organization tied to the EIN that Color Us United is using to claim its non-profit status has apparently not filed a recent Form 990-N and is listed in databases as pending inactivation. And while a charity that spends $100k in four years on Facebook ads could conceivably be taking in less than $25,000 per year, in my mind this comes very close to crossing a line, if not completely obliterating it. I am skeptical that Color Us United/Center for Race and Opportunity is taking in less than $25,000 per year and is not compensating anyone, especially considering that being president of the organization is the only job Mr. Xu lists on his LinkedIn profile at the moment.

Who is paying the bills at Color Us United? Color me unsure. I think it deserves more scrutiny.

Tarus Balog : 2022 Open Source Summit – Day 3

June 24, 2022 08:01 PM

Thursday at the Open Source Summit started as usual at the keynotes.

Picture of Robin Bender Ginn on stage

Robin Bender Ginn opened today’s session with a brief introduction and then we jumped into the first session by Matt Butcher of Fermyon.

Picture of Matt Butcher on stage

I’ve enjoyed these keynotes so far, but to be honest nothing has made me go “wow!” as much as this presentation by Fermyon. I felt like I was witnessing a paradigm shift in the way we provide services over the network.

To digress quite a bit, I’ve never been happy with the term “cloud”. An anecdotal story is that the cloud got its name from the fact that the Visio icon for the Internet was a cloud (it’s not true) but I’ve always preferred the term “utility computing”. To me cloud services should be similar to other utilities such as electricity and water where you are billed based on how much you use.

Up until this point, however, instead of buying just electricity it has been more like you are borrowing someone else’s generator. You still have to pay for infrastructure.

Enter “serverless“. While there are many definitions of serverless, the idea is that when you are not using a resource your cost should be zero. I like this definition because, of course, there have to be servers somewhere, but under the utility model you shouldn’t be paying for them if you aren’t using them. This is even better than normal utilities because, for example, my electricity bill includes fees for things such as the meter and even if I don’t use a single watt I still have to pay for something.

Getting back to the topic at hand, the main challenge with serverless is spinning up a resource fast enough to respond to a request without expending resources while it is quiescent. Containers can take seconds to initialize and VMs much longer.

Fermyon hopes to address this by applying WebAssembly to microservices. WebAssembly (Wasm) was created to allow high-performance applications, written in languages other than JavaScript, to be served via web pages, although as Fermyon went on to demonstrate, this is not its only use.

The presentation used a game called Finicky Whiskers to demonstrate the potential. Slats the cat is a very finicky eater. Sometimes she wants beef, sometimes chicken, sometimes fish and sometimes vegetables. When the game starts, Slats will show you an icon representing the food she wants, and you have to tap or click on the right icon to feed her. After a short time, Slats will change her choice and you have to switch icons. You have 30 seconds to feed her as many correct treats as possible.

Slide showing infrastructure for Finicky Whiskers: 7 microservices, Redis in a container, Nomad cluster on AWS, Fermyon

Okay, so I doubt it will have the same impact on game culture as Doom, but they were able to implement it using only seven microservices, all in Wasm. There is a detailed description on their blog, but I liked the fact that it was language agnostic. For example, the microservice that controls the session was written in Ruby, but the one that keeps track of the tally was written in Rust. The cool part is that these services can be spun up on the order of a millisecond or less, and the whole demo runs on three t2.small AWS instances.

This is the first implementation I’ve seen that really delivers on the promise of serverless, and I’m excited to see where it will go. But don’t let me put words into their mouth, as they have a blog post on Fermyon and serverless that explains it better than I could.

Picture of Carl Meadows on stage

The next presentation was on OpenSearch by Carl Meadows, a Director at AWS.

Note: Full disclosure, I am an AWS employee and this post is a personal account that has not been endorsed or reviewed by my employer.

OpenSearch is an open source (Apache 2.0 licensed) set of technologies for storing large amounts of text that can then be searched and visualized in near real time. Its main use case is making sense of streaming data that you might get from, say, log files or other types of telemetry. It uses the Apache Lucene search engine, and the latest version is based on Lucene 9.1.
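To give a sketch of what that looks like in practice, here is a typical Lucene-style query body of the kind you might POST to an OpenSearch `_search` endpoint when hunting through log data. The index and field names are my own invented examples, not anything from the talk:

```python
import json

# A bool query: full-text match on the log message, filtered to the
# last hour. "@timestamp" is the conventional field name for event time.
query = {
    "query": {
        "bool": {
            "must": [{"match": {"message": "timeout"}}],
            "filter": [{"range": {"@timestamp": {"gte": "now-1h"}}}],
        }
    },
    "size": 20,  # return at most 20 hits
}
print(json.dumps(query, indent=2))
```

The same JSON body works whether you send it with curl, a language client, or from a Dashboards console.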

One of the best ways to encourage adoption of an open source solution is to have it integrate with other applications. With OpenSearch this has traditionally been done using plugins, but there is an initiative underway to create an “extension” framework.

Plugins have a number of shortcomings, especially in that they tend to be tightly coupled to a particular version of OpenSearch, so if a new version comes out your existing plugins may not be compatible until they, too, are upgraded. I run into this with a number of applications I use such as Grafana and it can be annoying.

The idea behind extensions is to provide an SDK and API that are much more resistant to changes in OpenSearch so that important integrations are decoupled from the main OpenSearch application. This also provides an extra layer of security as these extensions will be more isolated from the main code.

I found this encouraging. It takes time to build a community around an open source project but one of the best ways to do it is to provide easy methods to get involved and extensions are a step in the right direction. In addition, OpenSearch has decided not to require a Contributor License Agreement (CLA) for contributions. While I have strong opinions on CLAs this should make contributing more welcome for developers who don’t like them.

Picture of Taylor Dolezal on stage

The next speaker was Taylor Dolezal from the Cloud Native Computing Foundation (CNCF). I liked him from the start, mainly because he posted a picture of his dog:

Slide of a white background with the head and sad eyes of a cute black dog

and it looks a lot like one of my dogs:

Picture of the head of my black Doberman named Kali

Outside of having a cool dog, Dolezal has a cool job and talked about building community within the CNCF. Just saying “hey, here’s some open source code” doesn’t mean that qualified people will give up nights and weekends to work on your project, and his experiences can be applied to other projects as well.

The final keynote was from Chris Wright of Red Hat and talked about open source in automobiles.

Picture of Chris Wright on stage

A while ago I actually applied for a job with Red Hat to build a community around their automotive vertical (I didn’t get it). I really like cars and I thought that combining that with open source would just be a dream job (plus I wanted the access). We are on the cusp of a sea change with automobiles as the internal combustion engine gives way to electric motors. Almost all manufacturers have announced the end of production for ICEs, and electric cars are much more focused on software. Wright showed a quote predicting that automobile companies will need four times the software-focused talent that they need now.

A slide with a quote stating that automobile companies will need more than four times the software talent they have now

I think this is going to be a challenge, as the automobile industry is locked into 100+ years of “this is the way we’ve always done it”. For example, in many states it is still illegal to sell cars outside of a dealership. When it comes to technology, these companies have recently been focused on locking their customers into high-margin proprietary features (think navigation) and only recently have they realized that they need to be more open, such as supporting Android Auto or CarPlay. As open source has disrupted most other areas of technology, I expect it to do the same for the automobile industry. It is just going to take some time.

I actually found some time to explore a bit of Austin outside the conference venue. Well, to be honest, I went looking for a place to grab lunch and all the restaurants near the hotel were packed, so I decided to walk further out.

Picture of the wide Brazos river from under the Congress Avenue bridge

The Brazos River flows through Austin, and so I decided to take a walk on the paths beside it. The river plays a role in the latest Neal Stephenson novel called Termination Shock. I really enjoyed reading it and, spoiler alert, it does actually have an ending (fans of Stephenson’s work will know what I’m talking about).

I walked under the Congress Avenue bridge, which I learned was home to the largest urban bat colony in the world. I heard mention at the conference of “going to watch the bats” and now I had context.

A sign stating that drones were not permitted to fly near the bat colony under the Congress Avenue bridge

Back at the Sponsor Showcase I made my way over to the Fermyon booth where I spent a lot of time talking with Mikkel Mørk Hegnhøj. When I asked if they had any referenceable customers he laughed, as they have only been around for a very short amount of time. He did tell me that in addition to the cat game they had a project called Bartholomew that is a CMS built on Fermyon and Wasm, and that was what they were using for their own website.

Picture of the Fermyon booth with people clustered around

If you think about it, it makes sense, as a web server is, at its heart, a fileserver, and those already run well as a microservice.

They had a couple of devices up so that people could play Finicky Whiskers, and if you got a score of 100 or more you could get a T-shirt. I am trying to simplify my life which includes minimizing the amount of stuff I have, but their T-shirts were so cool I just had to take one when Mikkel offered.

Note that when I got back to my room and actually played the game, I came up short.

A screenshot of my Finicky Whiskers score of 99

The Showcase closed around 4pm and a lot of the sponsors were eager to head out, but air travel disruptions affected a lot of them. I’m staying around until Saturday and so far so good on my flights. I’m happy to be traveling again but I can’t say I’m enjoying this travel anxiety.

[Note: I overcame my habit of sitting toward the back and off to the side, so the quality of the speaker pictures has improved greatly.]

Tarus Balog : 2022 Open Source Summit – Day 2

June 23, 2022 05:48 PM

The word for Day 2 of the Open Source Summit is SBOM.

When I first heard the term my thought was that someone had spoken a particular profanity at an inappropriate time, but SBOM in this context means “Software Bill of Materials”. Open source is so prevalent these days that it is probably included in a lot of the software you use and you may not be aware of it, so when an issue is discovered such as Log4shell it can be hard to determine what software is affected. The idea of asking all vendors (both software-only and software running on devices) to provide an SBOM is a first step to being able to audit this software.
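To illustrate why this matters, here is a toy Python sketch (not any real SBOM tooling; the product and component names are invented) of the kind of lookup a machine-readable component list makes possible when something like Log4shell hits:

```python
# Each product ships an SBOM: a list of (component, version) pairs.
# With that in hand, "which of our products bundle log4j?" becomes a
# simple query instead of a scramble through vendor support tickets.
sboms = {
    "billing-app":  [("log4j-core", "2.14.1"), ("jackson-databind", "2.13.0")],
    "intranet-crm": [("slf4j-api", "1.7.32")],
}

def affected_by(component: str, sboms: dict) -> list:
    """Return the products whose SBOM lists the given component."""
    return [product for product, parts in sboms.items()
            if any(name == component for name, _version in parts)]

print(affected_by("log4j-core", sboms))  # ['billing-app']
```

A real audit would also match version ranges against vulnerability advisories, but the core win is the same: the dependency data exists up front.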

It isn’t as easy as you might think. The OpenNMS project I was involved with used over a hundred different open source libraries. I know because I once did a license audit to make sure everything being used had compatible licenses. I also have used Black Duck Software (now Synopsys) to generate a list of included software, and it looks like they now offer SBOM support as well, but I get ahead of myself.

Note that Synopsys is here in the Sponsor Showcase but when I stopped by the booth no one was there.

Getting back to the conference, the second morning keynotes were more sparsely attended than yesterday, but the room was far from empty. The opening remarks were given by Mike Dolan, SVP and GM of Projects at the Linux Foundation, and he was a last minute replacement for Jim Zemlin, who was not feeling well.

Picture of Mike Dolan on stage

Included in the usual housekeeping announcements was a short “in memoriam” for Shubhra Kar, the Linux Foundation CTO who passed away unexpectedly this year.

Dolan also mentioned that the Software Package Data eXchange (SPDX) open standard used for creating SBOMs had turned 10 years old (and it looks like it will hit 11 in August). This was relevant because with applications of any complexity including hundreds if not thousands of open source software projects, there had to be some formal way of listing them for analysis in an SBOM, and most default to SPDX.
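For a rough idea of what an SPDX document carries, here is a heavily abbreviated SPDX-2.x-style package list built as JSON. This is illustrative only; a real SPDX file has many more required fields (creation info, document namespace, relationships between packages):

```python
import json

# Skeleton of an SPDX-style SBOM: each bundled component is listed as a
# "package" with its version and declared license.
doc = {
    "spdxVersion": "SPDX-2.2",
    "name": "example-app-sbom",
    "packages": [
        {"name": "log4j-core", "versionInfo": "2.17.2",
         "licenseDeclared": "Apache-2.0"},
        {"name": "openssl", "versionInfo": "1.1.1o",
         "licenseDeclared": "OpenSSL"},
    ],
}
print(json.dumps(doc, indent=2))
```

Because the format is standardized, the same document can feed license audits and vulnerability scanners alike.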

The next speaker was Hilary Carter who is in charge of research for the Linux Foundation.

Picture of Mike Dolan and Hilary Carter on stage

She spoke on the work the Linux Foundation is doing to measure the worldwide impact of open source. As part of that she mentioned that there is a huge demand for open source talent in the marketplace, but there are also policy barriers that keep employees of many companies from contributing to open source. She also brought up SBOMs as a way to determine how widespread open source use is in modern applications.

Stylized Mercator Map Projection

Since diversity has been a theme at this conference I wanted to address a pet peeve of mine. This is a slide from Carter’s presentation and it uses a stylized Mercator projection to show the world. I just think it is about time we stop using this projection, as the continent highlighted, Africa, is actually much, much larger in proportion to the other continents than is shown on this map. As an alternative I would suggest the Gall-Peters projection.

Gall-Peters projection of the world yoinked from Wikipedia

To further digress, I asked my friend Ben to run “stylized Gall-Peters projection” through Midjourney but I didn’t feel comfortable posting any of the results (grin).

Anyway, enough of that. The next presenter was Kevin Jakel, who founded Unified Patents.

Picture of Kevin Jakel on stage

The goal of Unified Patents is to protect open source from patent trolls. Patent trolls are usually “non-practicing entities” who own a lot of patents but exist to extract revenue from companies they believe are infringing upon them versus building products. Quite frequently it is cheaper to settle than pursue legal action against these entities and this just encourages more actions on the part of the trolls.

The strategy to combat this is described as “Detect, Disrupt and Deter”. For a troll, the most desired patents are ones that are broad, as this means more companies can be pursued. However, overly broad patents are also subject to review, and if the Patent and Trademark Office is convinced a patent isn’t specific enough it can invalidate it, destroying the revenue stream for the patent troll.

I’m on the fence over software patents in general. If a company could create a piece of software that exactly modeled the human body and how a particular drug would interact with it, I think that would deserve some protection. But I don’t think that anyone owns the idea of, say, “swipe left to unlock”. It also seems like software rights could be protected by copyright, but then again IANAL (one source for more information on this is Patent Absurdity).

Picture of Amir Montazery on stage

The next person on stage was Amir Montazery of the Open Source Technology Improvement Fund. The mission of the OSTIF is to help secure open source software. They do this through both audits and fundraising, providing resources to open source projects to make sure their software is as secure as possible.

Jennings Aske of New York-Presbyterian Hospital spoke next. I have worked a bit with technology in healthcare, and as he pointed out, there are a lot of network-connected devices used in medicine today, from the devices that dispense drugs to the hospital beds themselves. Many of those do not have robust security (and note that these are proprietary devices). Since a hack or other breach could literally be a life and death situation, steps are being taken to mitigate this.

Picture of Jennings Aske on stage

I enjoyed this talk mainly because it was from the point of view of a consumer of software. As customers are what drive software revenues, they stand the best chance of getting vendors to provide SBOMs, along with government entities such as the National Telecommunications and Information Administration (NTIA). The NTIA has launched an effort called Software Component Transparency to help with this, and Aske introduced a project his organization sponsors called DaggerBoard that is designed to scan SBOMs to look for vulnerabilities.

Picture of Arun Gupta on stage

The next keynote was from Arun Gupta of Intel. His talk focused on building stronger communities and how Intel was working to build healthy, open ecosystems. He pointed out that open source is based largely on trust, which is an idea I’ve promoted since I got involved in FOSS. Trust is something that can’t be bought and must be earned, and it is cool to see large companies like Intel working toward it.

Picture of Melissa Smolensky on stage

The final presenter was Melissa Smolensky from Gitlab who based her presentation around a “love letter to open source”. It was cute. I too have a strong emotional connection to my involvement in free and open source software that I don’t get anywhere else in my professional life, at least to the same degree.

I did get to spend some time near the AWS booth today, and after chatting at length with the FreeRTOS folks I happened to be nearby when Chris Short did a presentation on GitOps.

Chris Short presenting GitOps

In much the same way that Apple inspired a whole generation of Internet-focused products to put an “i” in front of their name, DevOps has spawned all kinds of “Ops” such as AIOps and MLOps and now GitOps. The idea of DevOps was built around creating processes to more closely tie software development to software operation and deployment, and key to this was configuration management software such as Puppet and Ansible. Instead of having to manage configuration files per instance, one could store them centrally and use agents to deploy them into the environment. This central repository allows for a high degree of control and versioning.

It is hard to think of a better tool for versioning than git, and thus GitOps was born. Software managed with GitOps is controlled by configuration files (usually YAML), and changes are made through git.

While I am by no means an expert on GitOps, suppose your application uses a configuration file to determine the various clusters to create. To create a new cluster you would just edit the file in your local copy of the repo, then git commit and git push.

Your application would then use something like Flux (not to be confused with the Flux query language from InfluxData) to notice that a change has occurred and do a git pull, causing the change to be applied.

Pretty cool, huh? A lot of people are familiar with git so it makes the DevOps learning curve a lot less steep. It also allows for the configuration of multiple repositories so you can control, say, access to secrets differently than the main application configuration.
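To make that flow concrete, here is a minimal sketch of the two Flux (v2) objects that wire a cluster to a git repository: one tells Flux which repo and branch to watch, the other tells it which path in that repo to apply. The repo URL and path are hypothetical, and a real setup needs the Flux controllers installed in the cluster first:

```yaml
apiVersion: source.toolkit.fluxcd.io/v1
kind: GitRepository
metadata:
  name: app-config
  namespace: flux-system
spec:
  interval: 1m            # how often to poll the repo for new commits
  url: https://github.com/example/app-config   # hypothetical repo
  ref:
    branch: main
---
apiVersion: kustomize.toolkit.fluxcd.io/v1
kind: Kustomization
metadata:
  name: app
  namespace: flux-system
spec:
  interval: 5m
  sourceRef:
    kind: GitRepository
    name: app-config
  path: ./clusters/production   # hypothetical path within the repo
  prune: true                   # delete resources removed from git
```

Once these are applied, every git push to `main` is picked up on the next poll and reconciled into the cluster, which is exactly the edit/commit/push loop described above.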

Spot Callaway and Brian Proffitt

Also while I was in the booth I got this picture of two Titans of Open Source, Spot Callaway and Brian Proffitt. Oh yeah.

My final session of the day was given by Kelly O’Malley of Databricks on Delta Lake.

Kelly O'Malley presenting on Delta Lake

Now as someone who has given a lot of talks, I try to be respectful of the presenter and with the exception of the occasional picture and taking notes I try to stay off my phone. I apologized to her afterward as I was spending a lot of time looking up terms with which I was unfamiliar, such as “ACID” and “parquet“.

Delta Lake is an open source project to create a “Lakehouse”. The term is derived from a combination of “Data Warehouse” and “Data Lake“.

Data warehouses have been around for a very long time (in one of my first jobs I worked for a VAR that built hardware solutions for storing large data warehouses) and the idea was to bring together large amounts of operational data into one place so that “business intelligence” (BI) could be applied to help make decisions concerning the particular organization. Typically this data has been very structured, such as numeric or text data.

But people started figuring out that a lot of data, such as images, needed to be stored in more of a raw format. This form of raw data didn’t lend itself well to the usual BI analysis techniques.

Enter Delta Lake. Based on Apache Spark, it attempts to make data lakes more manageable and to make them as useful as data warehouses. I’m eager to find the time to learn more about this. When I was at OpenNMS we did a proof of concept using Apache Spark to perform anomaly detection, and it worked really well, so I think it is perfectly matched to make data lakes more useful.

My day ended at an internal event sponsored by Nithya Ruff, who in addition to being the chairperson of the Linux Foundation is also the head of the AWS OSPO. I made a number of new friends (and also got to meet Amir Montazery from the morning keynotes in person) but ended up calling it an early night because I was just beat. Eager to be fresh for the next day of the conference.

Tarus Balog : 2022 Open Source Summit – Day 1

June 22, 2022 06:08 PM

The main activities for the Open Source Summit kicked off on Tuesday with several keynote sessions. The common theme was community and security, including the Open Source Security Foundation (OpenSSF).

The focus on security doesn’t surprise me. I was reminded of this xkcd comic when the Log4shell exploit hit.

An xkcd comic showing how complex digital architecture depends on little known, small projects

At the time I was consulting for a bank and I called the SVP and said “hey, we really need to get ahead of this” and he was like “oh, yeah, I was invited to a security video call a short while ago” and I was like “take the call”.

I managed to squeeze into the ballroom just before the talks started, and I was happy to see the room was packed, and would end up with a number of people standing in the back and around the edges.

People in the hotel ballroom watching the keynote presentations

The conference was opened by Robin Bender Ginn, Executive Director of the OpenJS Foundation.

Picture of Robin Bender Ginn on stage

After going over the schedule and other housekeeping topics, she mentioned that in recognition of Pride Month the conference was matching donations to the Transgender Education Network of Texas (TENT) as well as Equality Texas, up to $10,000.

In that vein the first person to speak was Aeva Black, and they talked about how diversity can increase productivity in communities, specifically open source communities, by bringing in different viewpoints and experiences. It was very well received, with many people giving a standing ovation at its conclusion.

Picture of Aeva Black on stage

The next speaker was Eric Brewer from Google (a platinum sponsor) and his talk focused on how to improve the robustness and security of open source (and he joked about having to follow Black with such a change of topic). Free software is exactly that, free and “as is”. So when something like Log4shell happens that impacts a huge amount of infrastructure, there is really no one who has an implicit obligation to rectify the issue. That doesn’t prevent people from trying to force someone to fix things, as this infamous letter to Daniel Stenberg demonstrates.

Picture of Eric Brewer on stage

Brewer suggests that we work on creating open source “curators” who can provide commercial support for open source projects. In some cases they could be the maintainer, but it is not necessary. When I was at OpenNMS our support offerings provided some of this indemnification along with service levels for fixing issues, but of course that came at a cost. I think it is going to take some time for people to realize that free software does not mean a free solution, but this idea of curators is a good start.

I got the feeling that the next presentation was one reason the hall was so packed as Linus Torvalds and Dirk Hohndel took the stage. Linus will be the first to admit that he doesn’t like public speaking, but I found that this format, where Dirk asked him questions and he responded, worked well. Linus, who is, well, not known for suffering fools gladly, admitted and apologized for his penchant for being rather sharp in his criticism, and when Dirk asked if he was going to be nicer in the future Linus said, no, he probably wouldn’t so he wanted to proactively apologize. That made me chuckle.

Picture of Linus Torvalds and Dirk Hohndel on stage

This was followed by a security-focused presentation by Todd Moore from IBM, another platinum sponsor. He also addressed trying to improve open source security but took an angle more aimed at government involvement. Digital infrastructure is infrastructure, much like bridges, roads, clean water, etc., and there should be some way for governments to fund and sponsor open source development.

Picture of Todd Moore on stage

The final keynote for today was a discussion with Amy Gilliland who is the President of General Dynamics Information Technology (GDIT). In a past life I worked quite a bit with GDIT (and you have to admit, that can be a pretty appropriate acronym at times) and it is nice to see a company that is so associated with more secretive aspects of government contracting focusing on open source solutions.

Picture of Amy Gilliland on stage

After the keynotes I visited the Sponsor Hall to see the AWS booth. It was pretty cool. As a diamond sponsor it is right in front as you enter.

AWS Booth in the Sponsor Hall

There were people from a number of the open source teams at AWS available to do presentations, including FreeRTOS and OpenSearch.

People in the Sponsor Hall

I don’t have booth duty this conference so I decided to wander around. I thought it was laid out well and it was interesting to see the variety of companies with booths. I did take some time to chat with the folks at Mattermost.

Mattermost Booth in the Sponsor Hall

While I’m a user of both Discord and Slack, I really, really like Mattermost. It is open source and provides a lot of the same functionality as Slack, and you can also host it yourself which is what the OpenNMS Project does. If you don’t want to go to the trouble of installing and maintaining your own instance, you can get the cloud version from Mattermost, and I learned that as of version 7 there is a free tier available so there is nothing preventing you from checking it out.

A selfie featuring me and whurley

I did take a short break from the conference to grab lunch with my friend William Hurley (whurley). It had been at least three years since we’d seen each other face to face and, thinking back, I was surprised at the number of topics we managed to cover in our short time together. He is an amazing technologist currently working to disrupt, and in many ways found, commercial quantum computing through his company StrangeWorks. He also made me aware of Amazon Braket, which lets those of us who aren’t whurley access quantum computing services. I’m eager to check it out as it is an area that really interests me.

After lunch I was eager to see a presentation on InfluxDB by Zoe Steinkamp.

A picture of Zoe Steinkamp presenting on InfluxDB

Time series data collection and storage was a focus of mine when I was involved in monitoring, and Influx is working to make flexible solutions using open source. Steinkamp’s presentation was on combining data collection at the edge with backend storage and processing in the cloud. Influx had a working example of a device that would monitor the conditions of a plant (she’s an avid gardener) such as temperature and moisture, and this data was collected locally and then forwarded to the cloud. They have a new technology called Edge Data Replication designed to make the whole process much more robust.

I was excited to learn about their query language. Many time series solutions focus so much on obtaining and storing the data and not enough on making that data useful, which to me seems to be the whole point. I’m eager to play with it as soon as I can.
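For a taste of it, here is roughly what a basic Flux query looks like — it reads top to bottom as a pipeline, though the bucket, measurement, and field names here are invented for illustration:

```
from(bucket: "garden")
  |> range(start: -1h)
  |> filter(fn: (r) => r._measurement == "soil" and r._field == "moisture")
  |> mean()
```

Each |> stage transforms the output of the previous one, which makes it easy to build up an analysis incrementally.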

One thing that bothered me was that the hotel decided to have the windows washed in the middle of the presentation.

A picture a window washer

Steinkamp did a great job of soldiering through the noise and not letting it faze her.

The evening event was held at Stubbs restaurant, which is also a music venue.

The Stubbs Restaurant sign feature a billboard welcoming the Open Source Summit

I’ve been a fan of Stubbs barbecue sauce for years so it was cool to go to the restaurant that bears his name, even though the Austin location was opened in 1996, a year after Christopher B. Stubblefield died.

It was a nice end to a busy day, and I look forward to Day 2.

Tarus Balog : 2022 Open Source Summit – Day 0

June 21, 2022 01:15 PM

Monday was a travel day, but it was notable as it was the first time I have been in an airport since August. I fly out of RDU, and the biggest change was that they now have the “Star Trek” x-ray machines to scan carry-on luggage. While I was panicked for a second when I downloaded my boarding pass and didn’t see the TSA Precheck logo, I was able to get that sorted out so going through security was pretty easy.

The restrictions on masks for air travel have been lifted, but I wore mine along with about 10% of the other travelers. Even though I’ve had four shots and a breakthrough case of COVID I do interact with a lot of older people and since I’ll be around the most people in years at the Open Source Summit I figured I’d wear mine throughout the trip.

And while it isn’t N95, being a car nut I tried out these masks from K&N Engineering, who are known for high end air filtration for performance vehicles, and you almost don’t realize you are wearing a mask.

Anyway, I made my way to the Admiral’s Club and was pleasantly surprised to see it wasn’t very crowded. It was nice to have the membership (it comes with my credit card) as my flight to Charlotte was delayed over 90 minutes. I wasn’t too worried since I had a long layover before heading to Austin, so I was a lot less stressed than many of my fellow travelers.

The flight to Austin left on time and landed early, but we got hit with the curse in that our gate wasn’t available, so we ended up on the tarmac for 45 minutes, getting in 30 minutes late.

Not that I’m complaining. Seriously, according to my handy the trip from my home to Austin by car is 19 hours. From the moment I left my home until we landed was more like 8 hours, and most of that was enjoyable. I always have to remind myself of this wonderful clip by Louis CK which kind of sums up the amazing world in which we live where every time we fly we should be saying to ourselves “I’m in a chair in the sky!”

I checked in at the hotel and then we headed back out in our rented minivan to get the last member of our team, and then we drove about 45 minutes outside of Austin to this barbecue joint called Salt Lick in Driftwood, Texas. It was wonderful and I was told we owed this experience to a recommendation years ago from Mark Hinkle, so thanks Mark!

A white van in the parking lot of the Salt Lick barbecue restaurant

You can’t really tell a good barbecue restaurant by its looks (although shabbier tends to be better) so much as by the smell. When you get out of your vehicle your nose is so assaulted with the most wonderful smell you might be drawn to the entrance so quickly that you miss the TARDIS.

A British Police box that looks like the TARDIS from Doctor Who in the parking lot of the Salt Lick barbecue restaurant

We sat at a big picnic table and ordered family style, which was all you could eat meat, slaw, baked beans, bread, pickles and potato salad. I was in such a food coma by the end that I forgot to take a picture of the cobbler.

A table full of food at the Salt Lick barbecue restaurant

I tried not to fall asleep on the ride back to Austin (I wasn’t driving) but it was a great start to what I hope is a wonderful week.

Tarus Balog : 2022 Open Source Summit North America

June 16, 2022 04:41 PM

Next week I’ll be attending my first conference in nearly three years. My last one turned out to be the very last OSCON back in 2019. Soon after that I was in a bad car accident that laid me up for many months and then COVID happened.

Open Source Summit Logo Showing Member Conferences

I am both eager and anxious. Even having four vaccine shots and one breakthrough case I still feel a little exposed around large groups of people, but the precautions outlined in the “Health and Safety” section of the conference website are pretty robust and I am eager to see folks face-to-face (or mask-to-mask) once again.

The Linux Foundation’s Open Source Summit used to be known as Linuxcon and now it is an umbrella title for a number of conferences around open source, all of which look cool. My new employer, AWS, is a platinum sponsor and will also have a booth (I am not on booth duty this trip but I’ll be around). I am looking forward to getting to meet in person many of my teammates who I’ve only seen via video, old friends I haven’t seen in years, and to making a bunch of new ones.

Of course, we would have to have a conference in Austin during a heat wave. I was thinking about never leaving the conference venue but then I remembered … barbecue.

If you are going and would like to say “hi” drop me a note on Twitter or LinkedIn or send an e-mail to tarus at tarus dot io.

Tarus Balog : In Pursuit of Quality Interactions

June 15, 2022 01:25 PM

Recently my friend Jonathan had a birthday, and I sent him a short note with best wishes for the day and to let him know I was thinking about him.

In his reply he included the following paragraph:

[I] was reminded of your comment about a sparsely attended OUCE conference at Southampton one year. You said something along the lines of that it didn’t matter, that you would try to make it the best experience you could for everyone there. That stuck with me. It’s been one of my mantras ever since then.

I can remember talking about that, although I also remember I was very ill during most of that conference and spent a lot of time curled up in my room.

Putting on conferences can be a challenge. You don’t know how many people will show up, but you have to plan months in advance in order to secure a venue. Frequently we could use information about the previous conference to approximate the next one, but quite often there were a number of new variables that were hard to measure. In this case moving the conference from Germany, near Frankfurt, to Southampton in the UK resulted in a lot fewer people coming than we expected.

It is easy to get discouraged when this happens. I have given presentations in full rooms where people were standing in the back and around the edges, and I have given presentations to three people in a large, otherwise empty room. In both cases I do my best to be engaging and to meet the expectations of those people who were kind enough to give me their attention.

I think this is important to remember, especially in our open source communities. I don’t think it is easy to predict which particular people will become future leaders on first impressions, so investing a little of your attention in as many people as possible can reap large results. I can remember when I started in open source I’d sometimes get long e-mails from people touting how great they were, which was inevitably followed up with a long list of things I needed to do to make my project successful. Other times I’d get a rather timid e-mail from someone wanting to contribute, along with some well written documentation or a nice little patch or feature, and I valued those much more.

I can remember at another OUCE we ended up staying at a hotel outside of Fulda because another convention (I think involving public service vehicles like fire trucks and ambulances) was in town at the same time. There was a van that would pick us up and take us into town each morning, and on one day a man named Ian joined me for the ride. He was complaining about how his boss made him come to the conference and he was very unhappy about being there. I took that as a challenge and spent some extra time with him, and by the end of the event he had become one of the project’s biggest cheerleaders.

Or maybe it was just the Jägermeister.

In the book Zen and the Art of Motorcycle Maintenance the author Robert Pirsig demonstrates a correlation between “attention” and “quality”. In today’s world I often find it hard to focus my attention on any one thing at a time, and it is something I should improve. But I do manage to put a lot of attention into person-to-person interactions, and that has been very valuable over the years.

In any case I was touched that Jonathan remembered that from our conversation, and it helps to be reminded. It also motivated me to write this blog post (grin).

Tarus Balog : AWS: Impressions So Far

June 08, 2022 04:20 PM

When I announced that I had joined AWS, at least two of my three readers reached out with questions so I thought I’d post an update on my onboarding process and impressions so far.

One change you can expect is that when I talk about my job on this blog, I’m going to add the following disclaimer:

Note: Everything expressed here represents my own thoughts and opinions and I am not speaking for my employer Amazon Web Services.

Back when I owned the company I worked for I had more control about what I could share publicly. While I am very excited to be working for AWS and may, at some time in the future, speak on their behalf, this is not one of those times.

A number of people joked about me joining the “dark side”. My friend Talal even commented on my LinkedIn post with the complete “pitch speech” Darth Vader made to Luke Skywalker in Empire. While I got the joke I’d always had a pretty positive opinion of Amazon, gained mainly through being a retail customer.

I recently went and traced what I think to be my first interaction with Amazon back to a book purchase made in December of 1997. In the nearly 25 years I’ve been shopping there I can think of only two times that I was disappointed with their customer service (both involving returns) and numerous times when my expectations were exceeded by Amazon. For example, I once spent around $70 on two kits used to clean high performance automotive air filters. In shipment one of them leaked, and I asked if I could return it. They told me to keep both and refunded the whole $70, even after I protested that I’d be happy with half that.

It was this focus on customer service that attracted me to the possibility of working with Amazon. When I was at OpenNMS I crafted a mission statement that read “Help Customers. Have Fun. Make Money”. I thought I came up with it on my own but I may have gotten inspiration from a Dilbert cartoon, although I changed the order to put the focus on customers. I always put a high value on customer satisfaction.

I have also been a staunch, and I’ll admit, opinionated, proponent of free and open source software and nearly 20 years of those opinions are available on this blog. Despite that, AWS still wanted to talk to me, and as I went through the interview process I really warmed to the idea of working on open source at AWS.

Just before I started I received a note from the onboarding specialist with links to content related to Amazon’s “peculiar” culture. When I read the e-mail I was pretty certain they meant “particular”, as “particular” implies “specific” and “peculiar” implies “strange”. Nope, peculiar is the word they meant to use and I’m starting to understand why. They are so laser-focused on customer satisfaction that their methods can seem strange to people used to working in other companies.

As you can imagine with a company that has around 1.6 million employees, they have the onboarding process down to a science. My laptop and supporting equipment showed up before my start date, and with few problems I was able to get on the network and access Amazon resources. These last two weeks have been packed with meeting people, attending virtual classes with other new hires, and going through a lot of online training. One concept they introduce early on is the idea of “working backwards”. At Amazon, everything starts from the customer and you work backwards from there. After having this drilled into my head in one of the online courses it was funny to watch a video of Jeff Bezos during an All Hands meeting where someone asks if the “working backwards” process is optional.

Based on my previous experience with large companies I was certain of the answer: no, working backwards is not optional. Period.

But that wasn’t what he said. He said it wasn’t optional unless you can come up with something better. I know it is kind of a subtle distinction but it really resonated with me, as it drove home the fact that at Amazon no process is really written in stone. Everything is open to change if it can be improved. As I learn more about Amazon I’ve found that there are many “tenets”, or core principles, and every one of them is presented in the context that these exist until something better is discovered, and there seem to be a lot of processes in place to suggest those improvements at all levels of the company.

If there is anything that isn’t open to change, it is the goal of becoming the world’s most customer-centric company. While a lot of companies can claim to be focused on their customers without many specifics, at Amazon this is defined as having low prices, large selection and a great customer experience. Everything else is secondary.

I bring this up because it is key to understanding Amazon as a company. To get back to my area of expertise, open source, quite frequently open source involvement is measured by things such as number of commits, lines of code committed, number of projects sponsored and number of contributors. That is all well and good but seen through the lens of customer satisfaction they mean nothing, so they don’t work at Amazon. Amazon approaches open source as “how can our involvement improve the experience of our customers?”

(Again, please remember that is my personal opinion based on my short tenure at AWS and doesn’t constitute any formal policy or position)

Note that with respect to open source at AWS, “customer” can refer to both end users of software who want an easy and affordable way to leverage open source solutions as well as open source projects and companies themselves. My focus will be on the latter and I’m very eager to begin working with all of these cool organizations creating wonderful open source solutions.

This focus may not greatly increase those metrics mentioned above, but it is hoped that it will greatly increase customer satisfaction.

So, overall, I’m very happy with my decision to come to AWS. I grew up in North Carolina where the State motto is Esse Quam Videri, which is Latin for “to be rather than to seem”. My personal goal is to see AWS considered both a leader and an invaluable partner for open source companies and projects. I realize that won’t happen overnight and I welcome suggestions on how to reach that goal. In any case it looks like it is going to be a lot of fun.

Mark Turner : A return to recording engineering

June 08, 2022 01:51 AM

A side effect of my work on singing has been discovering what tools I need to sound decent. I started with a very good USB microphone a few years ago and then graduated to an inexpensive, 8-channel USB mixer board that I could use with some decent XLR mics I had lying around. When I got my current job, I went out and bought a top-of-the-line Shure SM7B microphone and paired it with my mixer, which got me even closer to the professional sound I wanted. Then I found a used digital sound card, an 8-channel Firewire-based M-Audio 2626 and bought it cheap.

FireWire is essentially an abandoned technology now that Apple no longer ships systems with it, but it is still alive and well in Linux. I took one of my old desktop PCs out of storage, added a hard drive, installed Ubuntu Studio on it, and now have a digital audio workstation (DAW), for dirt cheap! Ubuntu Studio comes with a huge number of audio and video production tools and plugins. It works just fine with this very old M-Audio 2626, too.

My audio tool of choice for editing was once Audacity, but Ubuntu Studio comes with the open-source, Pro Tools-like DAW called Ardour. I’ve learned how to do some amazing things with manipulating audio using Ardour, simply by diving in and trying different things. I’m sure there is at least 200% more I can be doing with it when I fully understand its capabilities.

Over the past few days and nights, I’ve spent my free time using Ardour to recreate one of my favorite songs, R.E.M.’s These Days. I’ve often looked for old-school karaoke tracks for R.E.M. but there are few that aren’t the hits everyone’s heard a million times already. I did some Google searches to see if anyone’s done this themselves and hit pay dirt when I found a musician named Clive Butler. Clive posted several of his R.E.M. covers to Blogger from 2011-2018 and I thought I’d start with those. Then last week, I discovered he has fresh versions on his very own YouTube channel so I downloaded his version of These Days.

Another fortuitous find was the work of a YouTube user named BaldAndroid. BaldAndroid has remixed many R.E.M. albums, bringing to the fore instruments and voices that were once buried in the release mix. I’ve been able to suss out parts in These Days that I could never make out before, and have used this guide to recreate these sounds in my own version. Suddenly, my two-track project has ballooned to 7 or more tracks, but such is the nature of professional recording. I was happy this afternoon when I put what may be the finishing touches on the project, proud of how close mine sounds to the original. Then again, I do laugh when I realize what I’ve managed to do is recreate what was state of the art over FORTY YEARS AGO! I still have a lot to learn, obviously!

The little recording room I’ve built right off my office has become my latest happy place. I can close the door, slip on the headphones, and get lost in the recording process. I can spend hours there, cutting and recutting takes, adding effects, getting the timing and levels right, and all the other stuff that goes into making something sound great. In a way I’ve come full circle now, having started off as a recording engineer at Sing-A-Song Recording Studio at Carowinds back in 1987. I’m having fun seeing what more I can do with this.

Mark Turner : Playing in a band – DNR

June 08, 2022 01:23 AM

As I mentioned previously, I’d taken my singing much more seriously over the last few years, practicing for hours each week to improve my technique. At the end of last year, I got good enough to post a few audio clips and videos on a bandmate-finding website called BandMix. It took about a week before a few bands reached out to me, interested to talk to me about fronting their bands. I said yes to one which was a new Creedence Clearwater Revival tribute band but we never rehearsed because of a surge in COVID at the time. I wound up leaving the band and it kind of broke up soon afterward. Then I got interest from a Beatles tribute band, too, but didn’t think the music was varied or interesting enough. Finally, a musician reached out who was interested in the same music I was – and it was across the gamut of styles. My interest was piqued!

In Beaufort, NC, tagging along on one of Kelly’s work trips at the end of December, I got a call from Chuck, the drummer, who proceeded to talk my ear off on all the stuff the band planned to play. A week later, I showed up at the practice space at Kit’s home and sang a few songs for him. He didn’t say much but his ear-to-ear grin told me all I needed to know. Thus, I became the frontman for DNR.

DNR is composed of veteran musicians, many with a decade or more experience playing in bands. As for me, this is my very first band. At our early rehearsals, held almost every Saturday morning, I found myself being stared at by my bandmates, waiting for me to take charge and get us playing. It took me a few beats (ha!) to learn how to actually lead a band, but basically I faked it until I figured out what I was doing. I never considered before how cool and powerful it feels to set this band (or any band) in motion. It’s not something I pondered when I was singing solo to karaoke tracks!

So we rehearsed and rehearsed, picked an interesting setlist, and missed various practices here and there due to vacations, COVID cases, and what have you. Finally, after months of hard work rehearsing, we held our first gig over the Memorial Day weekend: a surprise birthday party for Chuck’s wife, Claudia. There were about two dozen people in attendance, and friendly faces at that, but re-watching the video I took I appreciate more and more how heartfelt the applause we earned was.

As we were returning from a break I noticed Kit, our guitarist, was staring at me and chuckling.

“What? What’d I do? Did I miss something?” I asked him in a panic.

“You’re a natural!” he laughed, still grinning.

It was a great compliment. I could tell he meant it, too.

Being a frontman is more than just singing. I have to introduce the song, create banter with the audience, play percussion, get the tempo right when starting a song, and often adjust the sound board for the best sound. The guys look to me for leadership, which still cracks me up as I don’t really know what I’m doing. I guess I’m good at playing it off, though, or maybe just not caring anymore whether I screw up or not.

It is a very easy band to be in as everyone is sharp and witty. We have some real musicians, and take on music from Steely Dan, The Allman Brothers, The Moody Blues, The Doobie Brothers, Dr. John, and many others. While we were setting up for the private party gig, I played some tunes I thought we might want to add to our repertoire and, to my delight, I heard John (our lead guitarist) and Lance (our keyboardist) start to learn to play them. Such talent!

The other interesting thing about the band is that many of the members are nearly twenty years older than me! David, our bassist, is the closest in age at three years older (he is 56 at this writing). Some have been playing for 40 years or more. Many of the songs we play were new when they were playing.

In addition to frontman and recording engineer duties, I took on the role of social media person, creating the band’s Facebook page, website, and other social media accounts. I’ve used my experience in video streaming to record (and hopefully stream) our performances. At nearly every practice, I’ve recorded the audio (and usually video) for critiquing later. I have gigabytes of media now, so much that I’m starting to wonder where I’m going to put it all!

The only downside is that we don’t have regular gigs lined up yet. I am working to motivate the band to get this done so that we are not just playing for our own benefit but entertaining others. It was such a blast performing for people. It definitely brings a new energy to it all. I hope we can get some spots on local stages (preferably outside) so we can continue to generate buzz and grow our following.

I am loving the work I’m finally putting into music. It’s never too late to follow your dreams, huh?

Tarus Balog : Creating Strong Passwords

May 25, 2022 01:54 PM

For obvious reasons I’ve been creating some new passwords lately, and I wanted to share my method for creating strong passwords that are easy to remember yet hard to guess.

Of course, Randall Munroe set the bar with this comic:

xkcd Password Strength comic

It does make a lot of sense, but the method has its critics. Attackers can and do use random word generators which can break such passwords more quickly, even with, say, substituting “3” for “e”, etc.

There is also a good argument to be made that we should all be using password managers that generate long random passwords and not really creating passwords at all.

Then there is the very good idea of using two factor authentication, but that tends to augment passwords more than replace them.

In normal life you have to have at least a few passwords memorized, such as the one to get into your device and one to get into your password manager, so I thought I’d share my technique.

I like music, and I tend to listen to pretty obscure artists. What I do is think of a random lyric from a song I like and then convert it into a password.

For example, right now I’m listening to the album Wet Tennis by Sofi Tukker. The track that gives me the biggest earworm is “Original Sin” which opens with the lyric:

So I think you’ve got
Something wrong with you
Something’s not right with me, too
It’s not right with me

If I were going to turn that into a password, I would come up with something like:

sItUgswwysnrwm,2inrwm

Looks pretty random, and contains lower case and upper case letters, a number and a special character. At 21 characters it isn’t quite as long as “correcthorsebatterystaple” but you can always add more words from the lyrics if needed.
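The mapping is informal, but the gist can be sketched in a few lines of Python. The substitution table below is my own illustrative choice (the actual substitutions are whatever you can remember), not a fixed rule:

```python
# Sketch of the lyric-to-password idea: take the first character of each
# word (preserving its case), keep trailing punctuation, and swap in a
# digit or symbol for a few whole words.
SUBS = {"too": "2", "to": "2", "for": "4"}  # illustrative, not canonical

def lyric_to_password(lyric: str) -> str:
    out = []
    for word in lyric.split():
        core = word.rstrip(",.!?")           # word without trailing punctuation
        tail = word[len(core):]              # the punctuation itself, e.g. ","
        out.append(SUBS.get(core.lower(), core[:1]))
        out.append(tail)
    return "".join(out)

print(lyric_to_password("Something's not right with me, too"))  # Snrwm,2
```

The point is not the code, of course; the point is that the rule is simple enough to run in your head while the resulting string looks random to anyone else.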

Just thought I’d throw this out there as it works for me. The only thing I have to remember is not to hum the song while logging in.

Tarus Balog : The Adventure Continues

May 23, 2022 03:22 PM

Last year I wrote about parting ways with the OpenNMS Project and how I was ready for “Act III” of my professional career.

With my future being somewhat of a tabula rasa, I was a bit overwhelmed with choices, so I decided to return to my roots and dust off my consulting LLC. Soon I found myself in the financial sector helping to deploy network monitoring and observability solutions.

I was working with some pretty impressive applications and it was interesting to see the state of the art for monitoring. We’ve come a long way since SNMP. It was engaging and fun work, but all the software was proprietary and I missed the open source aspect.

Recently, Spot Callaway made me aware of an opportunity at Amazon Web Services for an open source evangelist position. Of all the things I’ve done in my career, acting as an evangelist for open source solutions was my favorite, and here was a chance to do it full time. I will admit that Amazon was not the first name that popped into my head when I think “open source”, but the more I learned about the team and AWS’s open source initiatives, the more interested I became in the position. After I made it through their rather intense interview process and met even more people with whom I’ll be working, it became a job I couldn’t refuse.

So I’m happy to announce that I’m now a Principal Evangelist at AWS, reporting to David Nalley, who, in addition to being a pretty awesome boss, is also the current President of the Apache Software Foundation. OpenNMS would not have existed without software from the ASF, and it will be cool to learn more about that organization first hand as well.

My main role will be to work with open source companies as an advocate for them within AWS. The solutions AWS provides can help jumpstart these companies toward profitability by providing the resources they need to be successful and to affordably grow as their needs change. While I am just getting started within the organization and it will take me some time to learn the ropes, I am hoping my own experience in running an open source business will provide a unique insight into issues faced by those companies.

Exciting times, so watch this space as my open source adventures continue.

Tarus Balog : “Run-of-the-Mill Person”

May 09, 2022 12:19 PM

I just noticed that my Wikipedia page has been deleted (the old version is still on the Internet Archive).

I’m not sure how I feel about this. When I was first made aware of its existence oh so many years ago I was both flattered and a little embarrassed, mainly because I didn’t think I rated a page on Wikipedia. But then I got to thinking that, hey, pretty much anyone should be able to have a page on Wikipedia as long as it adheres to their format guidelines. It’s not like it takes up much space, and as long as the person is verifiable as being a real person, why not?

I am certain I would have been okay with my page being deleted soon after it was created, but once you get used to having something, earned or not, there is a strong psychological reaction to having it taken away. From what I can tell the page was created in 2010, so it had been around for nearly 12 years with no one complaining.

The most hurtful thing was a comment about the deletion from EdwardX from London:

Nothing cited in the article counts towards WP:GNG, and I can find nothing better online. Run-of-the-mill person.

Really? Was the “Run-of-the-mill person” comment really necessary? (grin)

I’m still happy about what I was able to accomplish with OpenNMS and building the community around it, even if it was run-of-the-mill, and I plan to promote open source and open source companies for the remainder of my career, even if that isn’t Wikipedia-worthy.

Warren Myers : on using nmap to help find tlstorm-affected devices

March 11, 2022 06:20 PM

You may have heard of the recently-discovered/-published TLStorm vulnerability that affects – at least – APC SmartUPS devices.

One of the prime issues highlighted is the embedded nanoSSL library that APC has used on these systems.

If you want to find out if your systems are affected, the following nmap excerpt should start you towards a solution:

for octet in {30..39}; do (nmap -A -T4 192.168.0.$octet > nmap-192.168.0.$octet.out &) ; done

This will kick off the nmap scans in parallel. When they all finish (you can monitor how many are still running using ps aux | grep nmap), you can then process the files rapidly thusly:

grep -i nano nmap*.out

If nanoSSL has been found, you’ll get a listing of all IPs running it (since you cleverly named your files with the IP in the name).

The mitigations you choose to implement have been explained well in the articles linked above, but finding these systems can be a pain.

Hope this helps someone 🙂

Tarus Balog : Nineteen Years

February 19, 2022 02:42 PM

Nineteen years ago my friend Ben talked me into starting this blog. I don’t update it as frequently any more for a variety of reasons, mainly because more people interact on social media these days and I’m not as involved in open source as I used to be, but it is still somewhat of an achievement to keep something going this long.

My “adventures” in open source started out on September 10th, 2001, when I started a new job with a company called Oculan to work on their open source monitoring platform OpenNMS. In May of 2002 I became the lead maintainer on the project, and by the time I started this blog I’d been at it for several months. Back then blogs were one of the main ways an open source project could communicate with its community.

The nearly two decades I spent with OpenNMS were definitely an adventure, and this site can serve as a record of both those successes and those struggles.

Nineteen years ago open source was very different than it is today. Today it is ubiquitous: I think it would be rare for a person to go a single day without interacting with open source software in some fashion. But back then there was still a lot of fear, uncertainty and doubt about using it, with a lot of confusion about what it meant. Most people didn’t take it seriously, often comparing it to “shareware” and never believing that it would ever be used for doing “real” things. On a side note, even in 2022 I recently had one person make the shareware comparison when I brought up Grafana, a project that has secured nearly US$300 million in funding.

Back then we were trying to figure out a business model for open source, and I think in many ways we still are. The main model was support and services.

You would have thought this would have been more successful than it turned out to be. Proprietary software costing hundreds of thousands if not millions of dollars would often require that you purchase a maintenance or support contract running anywhere from 15% to 25% of the original software cost per year just to get updates and bug fixes. You would think that people would be willing to pay that amount or less for similar software, avoiding the huge upfront purchase, but that wasn’t the case. If they didn’t have to buy support they usually wouldn’t. Plus, support doesn’t easily scale. It is hard finding qualified people to support complex software. I’d often laugh when someone would contact me offering to double our sales because we wouldn’t have been able to handle the extra business.

One company, Red Hat, was able to pull it off and create a set of open source products people were willing to purchase at a scale that made them a multi-billion dollar organization, but I can’t think of another that was able to duplicate that success.

Luckily, the idea of “hosted” software gained popularity. One of my favorite open source projects is WordPress. You are reading this on a WordPress site, and the install was pretty easy. They talk about a “five minute” install and have done a lot to make the process simple.

However, if you aren’t up to running your own server, it might as well be a five year install process. Instead, you can go to “wordpress.com” and get a free website hosted by them and paid for by ads being shown on your site, or you can remove those ads for as little as US$4/month. One of the reasons that Grafana has been able to raise such large sums is that they, too, offer a hosted version. People are willing to pay for ease of use.

But by far the overwhelming use of open source today is as a development methodology, and the biggest open source projects tend to be those that enable other, often proprietary, applications. Two Sigma Ventures has an Open Source Index that tries to quantify the most popular open source projects, and at the moment these include Tensorflow (a machine learning framework), Kubernetes (a container orchestration platform) and of course the Linux kernel. What you don’t see are end user applications.

And that to me is a little sad. Two decades ago the terms “open source” and “free software” were often used interchangeably. After watching personal computers go from hobbyists to mainstream we also saw control of those systems move to large companies like Microsoft. The idea of free software, as in being able to take control of your technology, was extremely appealing. After watching companies spend hundreds of thousands of dollars on proprietary software and then being tied to those products, I was excited to bring an alternative that would put the power of that software back into the hands of the users. As my friend Jonathan put it, we were going to change the world.

The world did change, but not in the way we expected. The main reason is that free software really missed out on mobile computing. While desktop computers were open enough that independent software could be put on them, mobile handsets to this day are pretty locked down. While everyone points to Android as being open source, to be honest it isn’t very useful until you let Google run most of it. There was a time where almost every single piece of technology I used was open, including my phone, but I just ran out of time to keep up with it and I wanted something that just worked. Now I’m pretty firmly back into the Apple ecosystem and I’m amazed at what you can do with it, and I’m so used to just being able to get things going on the first try that I’m probably stuck forever (sigh).

I find it ironic that today’s biggest contributors to open source are also some of the biggest proprietary software companies in the world. Heck, even Red Hat is now completely owned by IBM. I’m not saying that this is necessarily a bad thing, look at all the open source software being created by nearly everyone, but it is a long way from the free software dream of twenty years ago. Even proprietary, enterprise software has started to leverage open APIs that at least give a nod to the idea of open source.

We won. Yay.

Recently some friends of mine attended a fancy, black-tie optional gala hosted by the Linux Foundation to celebrate the 30th anniversary of Linux. Most of them work for those large companies that heavily leverage open source. And while apparently a good time was had by all, I can’t help but think of, say, those developers who maintain projects like Log4j who, when there is a problem, get dumped on to fix it and probably never get invited to cool parties.

Open source is still looking for a business model. Heck, even making money providing hosted versions of your software is a risk if one of the big players decides to offer their version, as to this day it is still hard to compete with a Microsoft or an Amazon.

But this doesn’t mean I’ve given up on open source. Thanks to the Homebrew project I still use a lot of open source on my Macintosh. I’m writing this using WordPress on a server running Ubuntu through the Firefox browser. I still think there are adventures to be had, and when they happen I’ll write about them here.

Mark Turner : A positive COVID test

February 10, 2022 03:39 PM

Over the two-year course of this COVID-19 pandemic, I have taken extra steps to keep myself and my family safe. I’ve kept abreast of the latest medical advice and research. I’ve invested in N95 and KN95 masks. I’ve hauled around my HEPA air filter to places where proper ventilation would be hard to come by. Most importantly, whenever I’ve had the slightest concern that any health symptoms I’d been experiencing might have been COVID, I have gotten tested with Wake County’s free PCR COVID tests. Six times I’ve done this, and six times I received a relieving result of negative. Most recently, we were shipped a set of four COVID antigen tests free from the government, and a test using one of those turned up negative, too.

I kept my precautions up, thinking I had succeeded in avoiding a COVID infection. It turns out I may have been wrong and didn’t even know it.

Last week, I noticed that one of my right toes stung a little and looked bruised. I didn’t recall injuring it, so I wondered if it might be the “COVID toes” I’d heard about. See, COVID patients have reported sores on their toes (mainly toes, though fingers may be involved, too), and my toe looked suspiciously like this. COVID attacks the vascular system in addition to everything else it hits, and red toes can be a symptom. Around that time, I had an attack of my Raynaud’s Syndrome, with some of my fingers turning numb and white for over an hour. This red toe effect could also be caused by Raynaud’s (which is also a vascular disease), so I couldn’t say for sure what was what. Thus, I popped open the antigen test and 15 minutes later it told me I was COVID negative. Sure, an antigen test is not as accurate as a PCR test, but this was at the height of my symptoms, so I assumed if I was going to pop positive on anything it would be right at that moment. But, no, it was negative!

Over the weekend, I got to thinking about how my body reacted to the primary, secondary, and booster COVID vaccines I had gotten. Basically, I didn’t react at all! There were no noticeable side-effects whatsoever. I was thinking about this and deciding that perhaps my reaction to the actual virus would be a similar non-event. I decided to contact the VA to schedule a COVID antibody test, knowing that this might show whether I’d been exposed and didn’t know it.

I got the blood drawn this past Monday morning. The result came back the next day and, like I had started to suspect, it was positive. My body has SARS-COV-2 antibodies.

Now, experts caution that this does not necessarily mean I had been exposed to SARS-COV-2 (COVID-19), only that my body knows how to fight it. It could be that sometime in the past I’d been exposed to a similar coronavirus. However, I think it’s unlikely that it was anything other than one variant of COVID-19 or another. Most likely the omicron variant, the highly transmissible one responsible for more than 95% of current infections.

Just knowing I have antibodies, though, is a huge weight off of my shoulders. And if I was infected, the odds are high that my wife, son, and possibly my daughter have also had it and didn’t know it. My son Travis has taken twice as many tests as I have and had them all come back negative. He was astonished to test negative one time after eating lunch in a closed car with his school buddies, many of whom tested positive. To me, that seems like evidence that Travis had already seen COVID and was immune. At the time he credited his vaccines but there may be more to it than that. I think he feels better now, knowing that he, also, might have antibodies, though we still need to confirm this with a test.

I suppose if I had to get a positive COVID test result, this is the one to get. I’m glad I continued to protect folks outside of my bubble (and I will continue to do so), but with a teen in the home and one dropping in on a regular basis, it was inevitable that it was going to pass through at some point. Crazy that I never even knew it.

Tarus Balog : Nextcloud News

February 08, 2022 04:19 PM

I think the title of this post is a little misleading, as I don’t have any news about Nextcloud. Instead I want to talk about the News App on the Nextcloud platform, and I couldn’t think of a better one.

I rely heavily on the Nextcloud News App to keep up with what is going on with the world. News provides similar functionality to the now defunct Google Reader, but with the usual privacy bonuses you expect from Nextcloud.

Back before social networks like Facebook and Twitter were the norm, people used to communicate through blogs. Blogs provide similar functionality: people can write short or long form posts that will get published on a website and can include media such as pictures, and other people can comment and share them. Even now when I see an incredibly long thread on Twitter I just wish the author would have put it on a blog somewhere.

Blogs are great, since each one can be individually hosted without requiring a central authority to manage it all. My friend Ben got me started on my first blog (this one) that in the beginning was hosted using a program called Movable Type. When their licensing became problematic, most of us switched to WordPress, and a tremendous amount of the Web runs on WordPress even now.

Now the problem was that the frequency that people would post to their blogs varied. Some might post once a week, and others several times an hour. Unless you wanted to go and manually refresh their pages, it was difficult to keep up.

Enter Really Simple Syndication (RSS).

RSS is, as the name implies, an easy way to summarize content on a website. Sites that support RSS craft a generic XML document that reflects titles, descriptions, links, etc. to content on the site. The page is referred to as a “feed” and RSS “readers” can aggregate the various feeds together so that a person can follow the changes on websites that interest them.
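For illustration, a minimal RSS feed document looks something like this (all of the names and URLs here are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://example.com/</link>
    <description>A hypothetical blog</description>
    <item>
      <title>A Post Title</title>
      <link>https://example.com/a-post</link>
      <description>A short summary of the post.</description>
      <pubDate>Tue, 08 Feb 2022 16:19:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

A reader simply polls this document from time to time and shows you any `item` entries it hasn’t seen before, which is what lets one application keep up with dozens of sites.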

Google Reader was a very useful feed reader that was extremely popular, and it in turn increased the popularity of blogs. I put some of the blame for the rise of the privacy nightmare of modern social networks on Google’s decision to kill Reader, as it made individual blogs less relevant.

Now in Google’s defense they would say just use some other service. In my case I switched to Feedly, an adequate Reader replacement. The process was made easier by the fact that most feed readers support a way to export your configuration in the Outline Processor Markup Language (OPML) format. I was able to export my Reader feeds and import them into Feedly.

Feedly was free, and as they say if you aren’t paying for the product you are the product. I noticed that next to my various feed articles Feedly would display a count, which I assume reflected the number of Feedly users that were interested in or who had read that article. Then it dawned on me that Feedly could gather useful information on what people were interested in, just like Facebook, and I also assume, if they chose, they could monetize that information. Since I had a Feedly account to manage my feeds, they could track my individual interests as well.

While Feedly never gave me any reason to assign nefarious intentions to them, as a privacy advocate I wanted more control over sharing my interests, so I looked for a solution. As a Nextcloud fan I looked for an appropriate app, and found one in News.

News has been around pretty much since Nextcloud started, but I rarely hear anyone talking about its greatness (hence this post). Like most things Nextcloud it is simple to install. If you are an admin, just click on your icon in the upper right corner and select “+ Apps”. Then click on “Featured apps” in the sidebar and you should be able to enable the “News” app.

That’s it. Now in order to update your feeds you need to be using the System Cron in Nextcloud, and instructions can be found in the documentation.

Once you have News installed, the next challenge is to find interesting feeds to which you can subscribe. The news app will suggest several, but you can also find more on your own.

Nextcloud RSS Suggestions

It used to be pretty easy to find the feed URL. You would just look for the RSS icon and click on it for the link:

RSS Icon

But, again, when Reader died so did a lot of the interest in RSS, and finding feed URLs became more difficult. I have links to feeds at the very bottom of the right sidebar of this blog, but you’d have to scroll down quite a way to find them.

But for WordPress sites, like this one, you just add “/feed” to the site URL, such as:

https://www.adventuresinoss.com/feed

There are also some browser plugins that are supposed to help identify RSS feed links, but I haven’t used any. You can also “view source” on a website of interest and search for “rss”, and that may help out as well.

My main use of the News App is to keep up with news, and I follow four main news sites. I like the BBC for an international take on news, CNN for a domestic take, Slashdot for tech news and WRAL for local news.

Desktop Version of News App

Just for reference, the feed links are:

BBC: http://newsrss.bbc.co.uk/rss/newsonline_uk_edition/front_page/rss.xml

CNN: http://rss.cnn.com/rss/cnn_topstories.rss

Slashdot: http://rss.slashdot.org/slashdot/slashdotMain

WRAL: http://www.wral.com/news/rss/48/

This wouldn’t be as useful if you couldn’t access it on a mobile device. Of course, you can access it via a web browser, but there exist a number of phone apps for accessing your feeds in a native app.

Now to my knowledge Nextcloud the company doesn’t produce a News mobile app, so the available apps are provided by third parties. I put all of my personal information into Nextcloud, and since I’m paranoid I didn’t want to put my access credentials into those apps but I wanted the convenience of being able to read news anywhere I had a network connection. So I created a special “news” user just for News. You probably don’t need to do that but I wanted to plant the suggestion for those who think about such things.

On my iPhone I’ve been happy with CloudNews.

iPhone Version of CloudNews App

It sometimes gets out of sync and I end up having to read everything in the browser and re-sync in CloudNews, but for the most part it’s fine.

For Android the best app I’ve used is by David Luhmer. It’s available for a small fee in the Play Store and for free on F-Droid.

Like all useful software, you don’t realize how much you depend on it until it is gone, and in the few instances I’ve had problems with News I get very anxious as I don’t know what’s going on in the world. Luckily this has been rare, and I check my news feed many times during the day to the point that I probably have a personal problem. The mobile apps mean I can read news when I’m in line at the grocery store or waiting for an appointment. And the best part is that I know my interests are kept private as I control the data.

If you are interested, I sporadically update a number of blogs, and I aggregate them here. In a somewhat ironic twist, I can’t find a feed link for the “planet” page, so you’d need to add the individual blog feeds to your reader.

Tarus Balog : Review: AT&T Cell Booster

January 31, 2022 07:40 PM

Back in the mid-2000s I was a huge Apple fanboy, and I really, really, really wanted an iPhone. At that time it was only available from AT&T, and unfortunately the wireless coverage on that network is not very good where I live.

In 2008 a couple of things happened. Apple introduced the iPhone 3G, and AT&T introduced the 3G Microcell.

The 3G Microcell, technically a “femtocell”, is a small device that you can plug into your home network and it will leverage your Internet connection to augment wireless coverage in a small area (i.e. your house). With that I could get an iPhone and it would work at my house.

In February 3G service in the US will cease, and I thought I was going to have to do without a femtocell. Most modern phones support calling over WiFi now, but it just isn’t the same. For example, if I am trying to send an SMS and there is any signal at all from AT&T, my phone will try to use that network instead of the much stronger wireless network in my house. If I disable mobile access altogether, the SMS will send fine but then I can’t get phone calls reliably. (sigh)

I thought I was going to have to just deal with it when AT&T sent me a notice that they were going to replace my 3G Microcell with a new product called a Cell Booster.

Now a lot of people criticize AT&T for a number of good reasons, but lately they’ve really been hitting the whole “customer service” thing out of the park. The Cell Booster currently shows out of stock on their website with a cost of $229, but they sent me one for free.

AT&T Cell Booster Box

In a related story my mother-in-law, who is on our family plan, was using an older Pixel that was going to stop working with the end of 3G service (it was an LTE phone but doesn’t support “HD Voice” which is required to make calls). So AT&T sent us a replacement Samsung S9. Pretty cool.

In any case the Cell Booster installation went pretty smoothly. I simply unplugged the existing 3G Microcell and plugged in the new device. The box included the Cell Booster, a GPS sensor, a power supply and an Ethernet cable. No other instructions outside of a QR code which will take you to the appropriate app store to download the necessary application to set it up.

The Booster requires a GPS lock, and they include a little “puck” connected to a fairly long wire that is supposed to allow one to get a signal even when the device is some distance away from a clear line of sight, such as away from windows. I just plugged it in to the back and left it next to the unit and it eventually got a signal, but it is also pretty much beneath a skylight.

In order to provision the Cell Booster you have to launch the mobile app and fill out a few pages of forms, which includes the serial number of the device. It has five lights on the front and while the power light came on immediately, it did take some time for the other lights, including “Internet”, to come up. I assumed the Internet light would turn on as soon as an IP address was assigned, but that wasn’t the case. It took nearly half an hour for the first four lights to come on, and then another 15 minutes or so for the final “4G LTE” light to illuminate and the unit to start working. Almost immediately I got an SMS from AT&T saying the unit was active.

AT&T Cell Booster Lights

Speaking of IP addresses, I don’t like putting random devices on my LAN so I stuck this on my public network which only has Internet access (no LAN access). I ran nmap against it and there don’t appear to be any ports open. A traffic capture shows traffic between the Cell Booster and a 12.0.0.0 network address owned by AT&T.

I do like the fact that, unlike the 3G Microcell, you do not need to specify the phone number of the handsets that can use the Cell Booster. It claims to support up to 8 at a time, and while I haven’t had anyone over who is both on the AT&T network and also not on my plan, I’m assuming it will work for them as well (I used to have to manually add phone numbers of my guests to allow them to use the 3G device).

The Cell Booster is a rebranded Nokia SS2FII. One could probably buy one outside of AT&T but without being able to provision it I doubt it would work.

So far we’ve been real happy with the Cell Booster. Calls and SMS messages work just fine, if not better than before (I have no objective way to measure it, though, so it might just be bias). If you get one, just remember that it takes a really long time to start up that first time, but after you have all five lights you should be able to forget it’s there.

Tarus Balog : Review: ProtonMail

January 26, 2022 12:55 PM

I love e-mail. I know for many it is a bane, which has resulted in the rise of “inbox zero” and even the “#noemail” movement, but for me it is a great way to communicate.

I just went and looked, and the oldest e-mail currently in my system is from July of 1996. I used e-mail for over a decade before then, on school Unix systems and on BBS’s, but it wasn’t until the rise of IMAP in the 1990s that I was able to easily keep and move my messages from provider to provider.

That message from 1996 was off of my employer’s system. I didn’t have my own domain until two years later, in 1998, and I believe my friend Ben was the one to host my e-mail at the time.

When I started maintaining OpenNMS in 2002 I had a server at Rackspace that I was able to configure for mail. I believe the SMTP server was postfix but I can’t remember what the IMAP server was. I want to say it was dovecot but that really wasn’t available until later in 2002, so maybe UW IMAP? Cyrus was pretty big at the time but renowned for being difficult to set up.

In any case I was always a little concerned about the security of my mail messages. Back then disks were not encrypted and even the mail transport was done in the clear (this was before SSL became ubiquitous), so when OpenNMS grew to the point where we had our own server room, I set up a server for “vanity domains” that anyone in the company could use to host their e-mail and websites, etc. At least I knew the disks were behind a locked door, and now that Ben worked with us he could continue to maintain the mail server, too. (grin)

Back then I tried to get my friends to use encrypted e-mail. Pretty Good Privacy (PGP) was available since the early 1990s, and MIT used to host plugins for Outlook, which at the time was the default e-mail client for most people. But many of them, including the technically minded, didn’t want to be bothered with setting up keys, etc. It wasn’t until later when open source really took off and mail clients like Thunderbird arrived (with the Enigmail plug-in) that encrypted e-mail became more common among my friends.

In 2019 the decision was made to sell the OpenNMS Group, and since I would no longer have control over the company (and its assets) I decided I needed to move my personal domains somewhere else. I really didn’t relish the idea of running my own mail server. Spam management was always a problem, and there were a number of new protocols to help secure e-mail that were kind of a pain to set up.

The default mail hosting option for most people is GMail. Now part of Google Workspace, for a nominal fee you can have Google host your mail, and get some added services as well.

I wasn’t happy with the thought of Google having access to my e-mail, so I looked for options. To me the best one was ProtonMail.

The servers for ProtonMail are hosted in Switzerland, a neutral country not beholden to either US or EU laws. They are privacy focused, with everything stored encrypted at rest and, when possible, encrypted in transport.

They have a free tier option that I used to try out the system. Now, as an “old”, I prefer desktop mail clients. I find them easiest to use, they let me bring all of my mail into one location, and they make it easy to move messages from one provider to another. The default way to access ProtonMail is through a web client, like GMail. Unlike GMail, ProtonMail doesn’t offer a way to directly access its services through SMTP or IMAP. Instead you have to install a piece of software called the ProtonMail Bridge, which creates an encrypted tunnel between your desktop computer and their servers. You can then configure your desktop mail client to connect to “localhost” on a particular port and it will act as if it were directly connected to the remote mail server.
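
For illustration, here is roughly what the desktop client settings look like when pointed at the Bridge. This is a sketch, not gospel: the ports shown are the Bridge’s defaults as I understand them, and the username/password come from the Bridge application itself, not your ProtonMail web login:

```
# Hypothetical mail client settings for the ProtonMail Bridge
# (the 127.0.0.1 ports are the Bridge defaults; check the Bridge app for yours)
IMAP server:  127.0.0.1   port 1143   security: STARTTLS
SMTP server:  127.0.0.1   port 1025   security: STARTTLS
Username:     you@example.com
Password:     the per-client password generated by the Bridge app
```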

In my trial there were two shortcomings that immediately impacted me. As a mail power user, I use a lot of nested folders. ProtonMail does not allow you to nest folders. Second, I share some accounts with my spouse (e.g. we have a single PayPal account) and previously I was able to alias e-mail addresses to send to both of our user accounts. ProtonMail does not allow this.

For the latter I think it has to do with the fact that each mail address requires a separate key and their system must not make it easy to use two keys or to share a key. I’m not sure what the issue is with nested folders.

In any case, this wasn’t a huge deal. To overcome the nested folder issue I just added a prefix to each mailbox, e.g. “CORR” for “Correspondence” and “VND” for “Vendor”, and then you can sort on name. And while we share a few accounts, we don’t use them enough that we couldn’t just assign each one to a particular user.


UPDATE: It turns out it is now possible to have nested folders, although it doesn’t quite work the way I would expect.

Say I want a folder called “Correspondence” and I want sub-folders for each of the people with whom I exchange e-mail. I tried the following:

So I have a folder named something like “CORR-Bill Gates”, but I’d rather have that nested under a folder entitled “Correspondence”. In my desktop mail client, if I create a folder called “Correspondence” and then drag the “CORR-Bill Gates” folder into it, I get a new folder titled “Correspondence/CORR-Bill Gates” which is not what I want.

However, I can log into the ProtonMail webUI and next to folders there is a little “+” sign.

Add Folder Menu Item
If I click on that I get a dialog that lets me add new folders, as well as to add them to a parent folder.

Add Folder Dialog Box

If I create a “Correspondence” folder with no parent via the webUI and then a “Bill Gates” folder, I can parent the “Bill Gates” folder to “Correspondence” and then the folders will show up and behave as I expect in my desktop e-mail client. Note that you can only nest two levels deep. In other words if I wanted a folder structure like:

Bills -> Taxes -> Federal -> 2021

it would fail to create, but

Bills -> Taxes -> 2021-Federal

will work.


After I was satisfied with ProtonMail, I ended up buying the “Visionary” package. I pay for it in two-year chunks and it runs US$20/month. This gives me ten domains and six users, with up to 50 e-mail aliases.

Domain setup was a breeze. Assuming you have access to your domain registrar (I’m a big fan of Namecheap) all you need to do is follow the little “wizard” that will step you through the DNS entries you need to make to point your domain to ProtonMail’s servers as well as to configure SPF, DKIM and DMARC. Allowing for DNS propagation, the whole thing can be done in a few minutes, though it may take up to an hour.
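
As a rough sketch of what that wizard has you enter, the records look something like the following (zone-file style; every value here is a placeholder – the wizard supplies the exact hostnames, verification strings, and DKIM targets for your domain):

```
; Hypothetical DNS entries for a ProtonMail-hosted domain (all values are placeholders)
example.com.         TXT    "protonmail-verification=..."              ; prove domain ownership
example.com.         MX     10 mail.protonmail.ch.                     ; mail routing
example.com.         MX     20 mailsec.protonmail.ch.
example.com.         TXT    "v=spf1 include:_spf.protonmail.ch ~all"   ; SPF
protonmail._domainkey.example.com.  CNAME  <wizard-supplied-target>    ; DKIM
_dmarc.example.com.  TXT    "v=DMARC1; p=quarantine"                   ; DMARC policy
```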

I thought there would be a big issue with the 50 alias limit, as I set up separate e-mails for every vendor I use. But it turns out that you only need an alias if you want to send e-mail from that address. You can set up a “catch all” address that will take any incoming e-mail that doesn’t expressly match an alias and send it to a particular user. In my case I set up a specific “catchall@” address but it is not required.

You can also set up filters pretty easily. Here is an example of sending all e-mail sent to my “catchall” address to the “Catch All” folder.

require ["include", "environment", "variables", "relational", "comparator-i;ascii-numeric", "spamtest"];
require ["fileinto", "imap4flags"];

# Generated: Do not run this script on spam messages
if allof (environment :matches "vnd.proton.spam-threshold" "*", spamtest :value "ge" :comparator "i;ascii-numeric" "${1}") {
return;
}


/**
* @type and
* @comparator matches
*/
if allof (address :all :comparator "i;unicode-casemap" :matches ["Delivered-To"] "catchall@example.com") {
fileinto "Catch All";
}

I haven’t had the need to do anything more complicated but there are a number of examples you can build on. I had a vendor that kept sending me e-mail even though I had unsubscribed so I set up this filter:

require "reject";


if anyof (address :all :comparator "i;unicode-casemap" :is "From" ["noreply@petproconnect.com"]) {
reject "Please Delete My Account";
}

and, voilà, no more e-mail. I’ve also been happy with the ProtonMail spam detection. While it isn’t perfect it works well enough that I don’t have to deal with spam on a daily basis.
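
For instance, building on the examples above, a filter along these lines could route mail for my prefixed folder scheme – say, filing anything sent to a hypothetical “vnd-” alias into a “VND” folder (the address and folder name here are made up for illustration):

```
require ["fileinto"];

# Hypothetical: file mail sent to any "vnd-*" alias into the "VND" folder
if address :all :comparator "i;unicode-casemap" :matches "To" "vnd-*@example.com" {
    fileinto "VND";
}
```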

I’m up to five users and eight domains, so the Visionary plan is getting a little resource constrained, but I don’t see myself needing much more in the near future. Since I send a lot of e-mail to those other four users, I love the fact that our correspondence is automatically encrypted since all of the traffic stays on the ProtonMail servers.

As an added bonus, much of the ProtonMail software, including the iOS and Android clients, is available as open source.

While I’m very satisfied with ProtonMail, there have been a couple of negatives. As a high-profile pro-privacy service it has been the target of a number of DDoS attacks. I have never experienced this problem, but as these kinds of attacks get more sophisticated and more powerful, it is always a possibility. Proton has done a great job of mitigating the impact, and the last big attack was back in 2018.

Another issue is that since ProtonMail is in Switzerland, they are not above Swiss law. In a high-profile case, a French dissident who used ProtonMail was tracked down via their IP address. Under Swiss law a service provider can be compelled to turn over such information if certain conditions are met. In order to make this more difficult, my ProtonMail subscription includes access to ProtonVPN, an easy-to-use VPN client that can be used to obfuscate a source IP, even from Proton.

They are also launching a number of services to better compete with GSuite, such as a calendar and ProtonDrive storage. I haven’t started using those yet but I may in the future.

In summary, if you are either tired of hosting your own mail or desire a more secure e-mail solution, I can recommend ProtonMail. I’ve been using it for a little over two years and expect to be using it for years to come.

Warren Myers : the first carafe in my ninja dualbrew pro cfp301

January 06, 2022 06:19 PM

A couple months ago I posted a first review of my Ninja Dualbrew Pro CFP301 K-Cup-compatible brewer.

This morning I made my first carafe of coffee with it.

First the pros:

  • very easy to brew a pot of coffee
  • it’s easy to pop the K-Cup holder out and swap to carafe mode
  • even without using the “keep warm” feature, coffee stays warm-to-hot in the carafe for a long time
  • cleanup is a breeze

Now for the con:

If you follow the directions for how much coffee to use (1-2 tablespoons per 6oz of water) … it comes out a little “thick” (I used a little over 9 T of Folger’s – I should have gone with about 1/5-1/4 less).

The strength of the brew was fine, but it’s not as smooth as I think coffee is supposed to be.

Warren Myers : the coop (with lots of in-progress pictures)

November 19, 2021 08:49 PM

As promised a few days ago, here’s the Big Writeup™ on our new coop

First, the pictures

That’s a lotta pictures! And I didn’t post them all!

Some of the key features of this coop:

  • 6’x8′ exterior floor dimensions
  • the floor’s covered in peel-n-stick vinyl tiles for easy cleaning
  • 12′ roof, which overhangs on the high and low sides by ~2′, and on the other two sides by ~1′
  • 4′ wide, 12′ long roof over the first part of the run
    • both roofs drop ~2′ over the 8′ of the width of the coop – making snow accumulation very unlikely
  • coop’s elevated about 30″ off the ground (makes for easy emptying of bedding material into wheel barrow)
  • pair of 6′ roosting bars
  • pair of 3-berth nesting boxes, with easy access from the outside (way better than the old coop, which mandated opening the door to get to them)
  • 8’x24′ fully enclosed run
  • plenty of ventilation (including removable window covers on the run side of the coop for use during the colder months)
  • plenty of light – there’s a 2’x6′ skylight on the high side of the coop
  • cleated entry ramp for the chickens to get from the run to the coop
  • as close to predator-proof as is reasonable to do
    • hardware cloth over the rear access gates, all windows, and the run door
    • poultry netting around the entirety of the run, with a second stretch of welded wire over the bottom 30″
    • poultry netting & cedar pickets enclosing two sides of the shaded region under the coop
    • poultry netting or pickets blocking open access to the roof rafters
  • mostly weather-proof location for feeder and waters (partially under the coop on the run side)
  • full-height run access door (was able to repurpose and reinforce an old screen door I had)
  • run anchored against sliding with 12″ rebars driven into the ground around the base

What improvements do I know I have left?

  • add water collection system to capture runoff
    • this will also allow for [semi]automatic watering vs schlepping a couple gallons of water a day to the waterer
  • shedette on the back side of the coop (facing away from the house) for food, tool, etc storage

How long did it take?

Calendar time, start to finish was about 3.5 months

Work time, start to finish was about 10-12 days

How much did it cost?

…more than I wish – but less than it could have 😉

Seriously, though – it wasn’t horrible: well under $2000 total 🙂

Could probably have saved some more on cost if I hadn’t bought the coop frame materials in July 2021 … but – c’est la vie: it had to be done, so we done did it 🙂

What would I do differently if I knew then what I know now?

First, I wouldn’t have preframed the wall panels – precut all the materials, sure: but preframing the walls turned out to make it more difficult to assemble than I had hoped

Second, I’d’ve accounted for materials better, so I didn’t have to make quite as many trips to my local Lowe’s

Third, I’d’ve made it 8’x8′ so I’d’ve had less cutting of plywood to do 🙂

Fourth, I’d’ve placed the floor cover (whether peel-n-stick tiles, or linoleum, or something else entirely) before mounting the floor to the posts and adding the walls – would’ve been way simpler!

Should you build a coop more-or-less like this one?

I don’t know – he’s on third, and I don’t give a darn!

Whoops – out popped an old comedy routine quote 😛

If you’ve got the space and the inclination to build it, something like this on your property could be an absolute blast of a project to undertake! I had more fun than not getting it built and ready for the chickens

If you decide to build a coop like this one, let me know! I’d love to see how yours turns out!

If you’d like copies of the rough drawings I made of each part, I’d be happy to share those, too

Warren Myers : ninja dualbrew pro cfp301 review

November 04, 2021 07:58 PM

After close to a decade, my old Keurig brewer finally bit the dust last week 🙁

Given we can’t go long without needing hot water or hot beverages in the family…it became “urgent” to replace it

We had been looking for a while, knowing a replacement was going to be necessary “soon” – so we already knew we wanted a brewer that would do both k-cups and brew regular coffee into a carafe

Keurig has a few that will do this, as do some other brands – but most of the reviews for them are…less than hearty praise

The two Ninja models we were considering, though, the CFP201 and CFP301, were very well rated

And for an added bonus – they were not more expensive than comparable Keurig models

However, I still wasn’t relishing the idea of paying well over $200 for a coffee maker :/

Thankfully, we didn’t have to

Kohl’s had a sale on Ninja appliances last week, and we had stackable coupons (percent and actual dollars off) plus Kohl’s rewards available (if you used them in store)…so to our local store I went (about 20 minutes before closing)

Picked up the Ninja DualBrew Pro CFP301 for almost 40% off their normal list price, and grabbed some rooibos stick tea, too, to try out

Been using it less than a week, but so far it’s fantastic

It has more brew size selections than the Keurig models, and will brew for specialty uses (over ice, etc) “natively”

It heats up water and starts brewing faster than any standalone brewer I’ve used (i.e. one not plumbed into a water line)

There’s also a bypass control to get just hot water – without it going through the brewpod adapter

My wife enjoyed me frothing her London fog latte’s milk a couple days ago, too 🙂

What complaints do we have [so far]?

Not many…and maybe they won’t be a big deal to you, but these are the drawbacks we’ve seen this far:

  • it’s really big – it’s probably got a 10% larger footprint than our old (and large) Keurig
  • if you choose to move the reservoir to the back, it juts way too far from the wall – and will not sit on the pod drawer we have
  • switching between operation modes is quick…but gosh! There are a lot of options!

Haven’t had a chance to try carafe brewing yet, but I can still give this brewer a very solid thumbs-up

Mark Turner : Hello and Goodbye to Google Fiber

October 30, 2021 12:00 PM


As y’all may know, I’ve been a booster of Google Fiber for a while. I signed us up for it the first day it became available. This week, I switched us back to AT&T. Let me explain why.

The server that hosts this website, my neighborhood mailing list, and other Internet stuff lives in a datacenter in Atlanta. I don’t really notice this, though, because AT&T Fiber’s routing is excellent! I get super-low latency of 16 ms for my round-trip pings. I can’t reach many cross-town servers much faster than that. When I switched us over to Google Fiber, that round-trip time jumped to 60-100 ms. I researched whether my hosting provider’s datacenters in other cities were any better, but it turns out Google Fiber’s routing is not nearly as good as AT&T’s. The city with the fastest server Google Fiber could get me to was Dallas which – as you geography buffs will note – is significantly farther from Raleigh than Atlanta. Go figure.

Please note that I’m a network nerd and my tech needs are, um, … unique. Normal people would probably not notice this stuff.

Being temporarily “dual-homed” with Google and AT&T meant I could negotiate rates. When I called to cancel AT&T, they offered me my same package at 30% off for 1 year (i.e., cheaper than Google and I can renew the deal next year). We now get for $60 what before we got for $90. Praise competition!

Google Fiber is still connected to our house (their fiber is still “lit.”) We’re not locked into AT&T with any contract so if AT&T pisses me off we can switch back without any trouble. Google just wants their WiFi Access Points back, which I didn’t use anyway.

There is also part of me that feels that a little bait-and-switch took place with Google Fiber. When Google Fiber was announced, I was under the impression that Google would devote its massive resources to making it a success. Instead, the company changed focus almost immediately, drastically putting the brakes on its deployments. It was clear Google was not willing to make the investments necessary to make Google Fiber a healthy concern for the next fifty years. Google is obsessively focused on its short-term stock market performance. It does not make investments the way railroads do, or like providers that expect to be relevant in 50 years, like AT&T.

Google Fiber switched to micro-trenching for its network installations. It also outsourced its installs to companies like Prime Telecom. I had multiple crews try to put in fiber, only to have me interrupt their installations because they were either bringing the fiber to the wrong side of our house or they were digging without doing utility locating. In hindsight, I suppose they usually skip the locating because it’s time-consuming and their shallow trenches rarely affect other buried utilities. They’d rather take the chance of busting something else than wait for lines to be marked. I don’t think this is a very professional game plan, personally.

Google Fiber does offer something unique, and that’s 2 Gbps service, twice as fast as our current service. This would be appealing to me but it is asymmetrical and the upload speed is still limited to 1 Gbps. I’d also have to upgrade all of our home networking gear to the new 2.5 Gbps standard. Well, technically I could use Google’s Wifi6 Access Points to go 2 Gbps but I want to use all the copper I’ve put into our house, rather than rely on WiFi. So, until Google makes the 2 Gbps service symmetrical I’ll stick with single-gigabit speeds.

All that being said, gigabit internet rocks! Saying goodbye to Spectrum forever rocks! Competition rocks! If you can get gigabit fiber, either through Google or AT&T, I recommend you do it. You will be happy you did!

Warren Myers : the new coop

October 29, 2021 05:18 PM

It’s been a long time coming

But the new chicken coop is done

First, let’s rewind the clock to late 2016

We had just moved to our “farm” to be closer to family out of the “big city” (not a farm, and not a big city … but you get the idea)

My father-in-law had some spare hens, so we built a simple pallet coop on a basic frame (some 2by pressure treated runners and a sheet of 3/4 plywood on top for the floor – a couple recovered/reused metal roof panels for the lid), and started our chicken-raising journey

It was a great interim coop – and could have been a great long-term coop … if we’d made it double the size

But we only planned to have 3-5 chickens at any given time, so it was good enough…until we decided we wanted more

While it could “handle” 7 or 8, it was tight

During the initial weeks of the pandemic in 2020, I made some improvements to the old coop while we planned a new one – added a window, redid the run door, redid the coop door…basic stuff – maybe $100 in materials all told

But it wasn’t going to handle more than ~6 chickens for any extended period of time, and it needed to be moved and/or have the run greatly expanded to really manage the flock well

Enter planning for a new coop

Oh

And watching lumber prices go through the roof 😐

While we waited for prices to at least start to come down, we reviewed scores of shapes and ideas – finally settling on a mild variation of a couple that kept popping-up when we’d look

First up was that it be raised off the ground so the chickens would have a shaded and rain-free area to congregate outside their coop, and a shaded and rain-free area for their food and water to be

Second was to ensure it could handle as many as 25 chickens without too many issues

Third was ensuring the run was bigger than the old one, and tall enough to stand under (the old run is only about 5′ at its highest point – making it impossible to stand under if you’re not a young kid)

Fourth was ensuring the new coop could be easily cleaned-out

Fifth was making sure there is more than one way to get into the run if the need arises

Sixth was ensuring the new coop would be well ventilated, and give the chickens substantially more light inside than the old one has

Ultimately, this led to a 6’x8′ coop with nesting boxes in the walls (so they’re not taking-up floor space), an 8’x24′ run plus the under-coop area (an additional 8’x6′ region), a 6’x2′ wall-width skylight, and ample well-screened ventilation windows

My next post will share in-progress photos, an approximate materials list, and ideas on what I’d do differently if I knew then what I know now

Warren Myers : pan-fried hamburgers

October 28, 2021 03:14 PM

Made some hamburgers the other night, and they came out better than expected – always a plus!

  • 1 lb 85% lean ground beef
  • 2 large eggs
  • 1/8c quick oats
  • Kinder’s buttery steakhouse rub
  • <1/8c Italian bread crumbs

Mix together like you’re going to make meatloaf

Divvy into 8 ~2oz balls – pack hand-tight

Preheat frying pan on medium-low heat (~3.5 on my stovetop)

Press meatballs slightly flat in pan

When they look done enough to flip, flip and press flat with your spatula

Remove from heat – adding cheese if desired – and serve when they’re done to your liking

Each hamburger will have ~12g protein, ~11g fat, and ~1g carbs

Jesse Morgan : Earbud Comparison

October 09, 2021 01:37 AM

ok, it’s been a while since I’ve posted. I’ve mainly been waiting on migrating off wordpress to Hugo, but that hasn’t happened and I need to collect my thoughts.

If you know me, you know I can be… Picky. I over-obsess about decisions (I have a spreadsheet with 17 apple varieties after finding out that Red Delicious were in fact the least delicious) to the point of absurdity. One of the things I’m most picky about are headphones.

I should point out that this is not a comprehensive list of all of the headphones I’ve owned- I’ve gone through over a dozen headphones over the years, but there are few that meet the “workhorse” requirement. I’m also sticking mainly to headphones I wear when I’m on the move.

Why I’m Picky

I’d started writing about why I preferred earhooks, but realized I needed to step back and explain my situation. My ear canals have a miniscule difference in diameter, so the standard in-ear single-flange eartips never fit consistently; either my left ear hurts or my right earphone keeps falling out. Even foam eartips feel like they’re slowly preparing to pop out.

In addition, I suffer from a broken hyper-awareness that makes it difficult to focus when there’s a lot of noise or crosstalk. If I can hear people speaking, I can’t listen to a podcast.

I also sweat very easily, so over-the-ear headphones cause me to sweat with any exertion, resulting in slipping and stink. Fine for desk work, but terrible for walks in the sun or yardwork.

First Love- Earhooks

For the longest time, I used wired Philips earhooks. There were two main reasons:

  1. They didn’t violate my ears like the normal rubber-tipped earbuds that are common today, and
  2. They wouldn’t fall off if I tipped my head.

After these became hard to find, I switched to Skullcandy, which I still occasionally use with my laptop.

The Day the 3.5mm Died

When Apple decided to get rid of the 3.5mm jack and force Bluetooth, I ignored it – I was an Android user and didn’t think Google would follow in their footsteps. Until they did. Even then, it wasn’t a problem for me until a friend gave me a deal on a used Pixel 2. While I had used Bluetooth headphones before, I knew that the conventional earbuds with an earhook weren’t an option.

I took a gamble on the Anker Soundcore Spirit X and found they were actually pretty decent. My only complaint was battery life. The earhooks helped reduce the discomfort of the eartips, but I could only wear them for so long.

For whatever reason, I ended up giving those to my son (who quickly destroyed them), while I switched to the Monoprice ANC headphones I had been using for work. The ANC was great for mowing, but they’d be soggy by the time I was done. Unfortunately the cheap band cracked, and tape could only hold it together for so long.

In Dec. 2020 I ordered a pair of Wyze Headphones as part of their beta program since I’ve had so much luck with their home automation stuff. These had even better ANC (active noise cancelling) as well as bluetooth 5.1, meaning I could sync to both my phone and my car without having to do the disconnect/reconnect dance.

Unfortunately, one side developed a crack and I had to RMA them. The second pair developed a crack in the same place. In addition, using them while mowing has resulted in an absolutely atrocious stink that won’t come off. This time, I decided to do some research.

Here Comes the Spreadsheet

I went through amazon and google looking for any and all earbuds that I thought wouldn’t suck. My needs were relatively straightforward:

  • Reasonable price (under $100)
  • Bluetooth 5.1
  • 9+ hour charge
  • water resistant
  • a microphone
  • noise cancellation

I also looked at ratings, reviews, total charge (if it came with a charging case), etc. I ended up with 55 candidates:

https://docs.google.com/spreadsheets/d/15FWw8eRS_sI7zZUDcU1LvmqMcg7jk_uV2iBhmB3c5qU/edit?usp=sharing

Long story short, I settled on XLeader SoundAngel Sport3 Pro, which appeared to be the best in class for what I was looking for: BT 5.1, 12 hr charge, IPX7, USB-C, ANC CVC8.0, and 1000+ ratings with a 4.1 average.

Too bad they hurt.

Betrayed by My Ears Again!

It turns out that in addition to mis-sized ear canals, my antitragus (see left) is in the way of the SoundAngel Sport3 Pro. If you look closely at the picture of the XLeader above, you’ll see a… lump? Burl? Lovehandle? …sticking out to the left of the eartip. That little lump is hard plastic, and it presses against my antitragus so hard that I couldn’t wear them for more than an hour.

After hours of gathering information, I couldn’t wear them.

Meanwhile, my wife decided to pick up a pair of inducer headphones that work great (though they wouldn’t help with noise cancellation). I’ve decided to go back to the drawing board and cast a wider net.


Warren Myers : a-frame coopettes for raising chicks

September 17, 2021 02:15 PM

We raise chickens.

For the last few years, we’ve only had layers – and they’ve all been full-grown by the time they arrived at our home.

This year, we decided to buy some chicks because our layers are starting to age-out of being able to lay, and we’re interested in trying our hand at raising a few birds for butchering ourselves.

Since you need to wait to add new birds to your flock until the birds are 6+ weeks old, we need a place for them to grow (they were ~8 days old when I bought them).

Here are some pictures of the first collapsible coopette for your viewing pleasure – after which I’ll describe how I put these things together

The first one (shown above) was the initial implementation of my idea…in which we decided hinging the access door on the top is less than ideal, and we discovered we need 3 hasps to hold the ends on rather than 2.

Materials used:

  • Pressure-treated 1x6x8 fence pickets (bought 29 for both coopettes, ended up with about 3.5 left over) – the second coopette is sturdier (and a little prettier)
  • Half-inch opening, 36″ wide hardware cloth (need ~22′ per coopette; ~30′ if you choose to make bottoms – I opted to not make coopette bottoms this time around)
  • Quarter-inch opening, 24″ wide hardware cloth (happened to have a perfectly-sized piece left from another project I could use on the second coopette door)
  • Staples
  • 1 1/4″ ceramic-coated deck screws
  • 2.5″ hinges (5 per coopette … though I wish I’d gone with 3″ hinges instead)
  • 3″ hasps (7 per coopette)

When folded-up, the sides collapse to ~3″ thick. The ends are about 2″ thick, too.

Total space needed against the side of your garage/shed/etc to store the coopette when you aren’t actively using it is ~3′ x 8′ x 6″, or slightly more than a folding table

Construction was very simple – I made the sides a smidge over 36″ wide so that I could attach the hardware cloth without trimming for more than length

The ends have a pair of 36″ long boards cut into trapezoids with 30° ends, and a butted ~30″ trapezoid, again with 30° ends (see photo for detail). The butt joint is secured via stapled hardware cloth (wrapped around from the outside to the inside – see photo) and a small covering inside screwed into both upright pieces. I used various pieces of scrap for those butt-joint covers

Wrapping the hardware cloth around the ends was the single most time-consuming (and painful!) aspect of construction. Start with a 36″x36″ piece, laid out square to the bottom of the end. Clamp in place (these 3″ spring clamps from Harbor Freight were a true godsend), and staple as desired … I may have gone a little overboard on the stapling front. On the second coopette, I relied more on sandwiching a little extra fence picket material to capture the hardware cloth, and a little less on staples.

Lessons Learned

Prototype 1 was quick-and-dirty – too much stapling, shouldn’t have had the door hinge at the top, needed to be more stable (sandwich the hardware cloth better)

And two hasps holding the ends on is not sufficient – you need three (one more-or-less at each corner) to really keep the end locked well, and to enable easy movement

Prototype 2 was not as dirty … but moving from fence pickets to 5/4 would be preferable

Likewise, wish I had put enough support at the bottom to be able to put some casters on at least one end to facilitate moving around the yard (to prevent killing-out the grass underneath)

What would I do differently in the future?

  • Make them longer than 8 feet (if you use 5/4 deck boards, buy the 10, 12, or 16 foot variety)
  • Make the sides slightly higher than 36″ to reduce the need for cutting hardware cloth (a very time-consuming task!)
  • Add wheels to one end for easy movement
  • Plan for a suspended waterer (the gap at the top happened to be wide enough to sling one up using a little rope and a couple carabiners – but it easily could not have been)
  • Hard-roof one end instead of using a tarp … or use a slightly larger tarp that would cover multiple coopettes at once instead of small ones that cover one at a time

Mark Turner : Jupiter is gone

August 29, 2021 02:00 AM

Jupiter in better days

Jupiter in better days


Today was the day I was hoping would never come, as impossible as it is. Today was the day we said goodbye to Jupiter, our porch cat.

Jupiter wandered into our lives ten years ago, his initial wariness giving way to unabashed love. Once a feral cat darting from home to home, he knew he had found his home when we stopped to feed and love him. The only night he ever spent indoors was his last one, last night.

I am in shambles.

He was the most dog-like cat I’ve ever known. He was super-chill, rarely letting anything faze him. He would come up and hug anyone who happened to stop by to chat. He would sometimes tag along with us when we would take the dog for a walk, trailing behind us and darting from home to home as if he were stalking us. He also always came running when he would hear our front door open or our voices calling to him.

He was a superstar of a cat.

I worried about keeping him outside. He got into a lot of scrapes that way. I was always worried about him and tried to do what I could to keep him as safe as I could short of inviting him inside. One time early on we had a scare when he disappeared for a few days but fortunately he came trotting back. But still I worried. I began to console myself that nature would take its course, whatever way it saw fit. As Travis said when he opted not to come to Jupiter’s euthanasia (and this is perfectly fine – everyone grieves differently), at least we will be aware of his fate. Many cats simply never come home.

It was only weeks ago that I had leapt out of bed in the dead of night, furiously putting on clothes and grabbing the flashlight because in my sleep I had thought I had heard him yowling as if in a fight. I’d get as far as flipping on the porch lights and see him peacefully snoozing on the front porch before I’d turn around and shuffle back to bed. Any time I’d hear that yowl (or thought I did) would have me leaping up.

Often he would scrap with the new neighbor’s cat when that cat would wander over and try to take Jupiter’s food. On occasion, Jupiter’s wounds would be severe enough to earn him a trip to the emergency veterinarian. Needless to say, I’m not a fan of the neighbor’s cat.

One night several years ago I leapt up from my recliner to hear an otherworldly yowl coming from the front of the house. I swung open the door and shone the flashlight across the street to see Jupiter squaring off with a large canid of some kind. I laughed when Jupiter quickly turned tail and trotted over to see me as if he had not a care in the world. It turns out whatever he had been facing was yowling, not Jupiter! I’m not sure if it was a dog, a coyote, or what, but Jupiter was clearly in command of the situation!

He was such an easygoing cat that he would even purr in the midst of getting his shots at the vet. The staff there fell in love with him and wanted to take him home. Though I’d manhandle him to put him in his carrier, or give him pills, he would never, ever bite or scratch me. Somehow I always had his trust.

Being a porch cat, Jupiter was known throughout the neighborhood. Neighbors would wave at him as he surveyed the area from his perch near our door. Some would come up and offer him head scratches. The Herrmann kids and the Ewasyshins girls would often sit on the stoop and get some loving from him. It was sweet to look out and see him being loved on, and returning the love as well.

Unfortunately, a few months ago a friend in Oakdale lost her identical-looking cat, Fred. She stapled posters up all over the neighborhood, causing many to misidentify Jupiter as Fred. One well-meaning but overzealous neighbor on Plainview Avenue scooped Jupiter up from our yard and called my friend, thinking she had found Fred. I diplomatically rescued him and did my best to let everyone save face but I was mad that someone would take my cat. I was also kinda mad that Fred’s owner’s posters had led to the catnapping.

Hallie had bonded just as strongly as I had with Jupiter. It was a team effort between us to coax this feral, jumpy, scrawny cat into becoming part of our family. With her away at college, she urged me to keep up with Jupiter and I did the best I could.

Travis had seen Jupiter slowing down. He said Jupiter wasn’t jumping up on the table to eat anymore. Indeed, I had not seen him hanging out the way he had before. Still, his behavior hadn’t crossed the threshold for me to be concerned about him. But he was obviously slowing down some as he had seemed to have lost weight. At his last vet visit in June, the vet reported that he had lost some weight but it wasn’t substantial enough to make any diet changes.

On Wednesday, though, I noticed he had a dark discharge from his eye and nose. He was sneezing and it appeared he had an injury to his left hind leg. I called and had a vet appointment for him within an hour. As soon as the vet examined him, she discovered some nasty bite marks on his abdomen. Some of the tissue had necrotized! She immediately took him into surgery, cut out the dead tissue, sutured the other wounds up as best she could, and gave him some fluids. Jupiter was sent home with a bag of antibiotics, pain pills, and the dreaded Cone of Shame. I took home a cat still dozy from the surgery and set him up on our back porch for recovery.

It has been a dreadfully hot summer, however, and Jupiter was slow to perk up. He was placed in the dogs’ crate with a catbox and a towel to lay on, his cone still firmly affixed around his neck. I soon saw him panting heavily and shallowly and it appeared he was unable to cool himself off. Kelly put up a towel to keep him out of the sun and I continued to try to get him to eat and drink.

Thursday night he finished off most of a can of cat food, so it appeared that we had turned a corner. I was hopeful he would recuperate. We didn’t want to keep him in the heat, though, and debated where he might go in the house. We settled on moving the crate back to the dining room where it had been for the dogs and keeping him inside for as long as it took. Jupiter seemed to be more comfortable there but he still showed little interest in eating or drinking.

When he showed no interest in his food and drink again this morning, I texted Hallie that she should not pass up a chance to see Jupiter again. She was shocked when she saw him, unable to raise his head to look at her though he could still be heard quietly purring. Soon Hallie was quietly sobbing and I realized my optimism about his recovery was foolishly misplaced. She urged me to take him in to the emergency vet and I hemmed and hawed until I called them an hour later.

Off Jupiter and I went to Wake Veterinary Hospital in Knightdale, with me thinking all the while that we would be inconvenienced by a few hours and that would be that. Sadly, it was to be Jupiter’s last trip.

The hospital was unusually slammed for a Saturday. On the phone, I was told there might be an hour wait, so it was a good time to come by. By the time we arrived, though, the wait had ballooned to 2 to 3 hours. I sat with him in their freezing cold waiting room while the dogs waiting around barked at each other.

Hallie texted me while we waited, saying she was fine if we had to euthanize Jupiter as she had already said goodbye but if we chose to do that, she wanted to be there. This sobering message erased any thoughts I had of walking out of there with our cat.

About an hour into our wait, a staffer named Lori came and got our information. Minutes later, the vet was checking Jupiter out. It was not much longer before Lori was calling me back into the exam room.

The doctor’s face told me all I needed to know. She sadly informed me that Jupiter was most likely already suffering from sepsis and possibly organ failure. He was over 10% dehydrated and his body temperature was lower than normal. We could put him through a tough regimen of fluids, drugs, and surgery only to still lose him.

She suggested it might be a good time to consider euthanasia.

Putting down a pet is a heart-rending thing to do. That it’s almost always the right thing to do doesn’t make it any easier. This little cat had so much love for us, and we had grown so used to this love, that the thought of not having this little critter around anymore was too painful to consider. And yet, something I had seen in him the day before – perhaps an imagined knowing in his eyes – made me suspect that he might be ready to go.

“OK, it sounds like he is not going to get better, so let’s do it,” I responded. It was hard to get the words out but there was no doubt they had to be said.

I told her that Kelly would be walking in any minute with some of my things so I would like to wait for her to get here. I also texted Hallie and she dropped what she was doing and began driving over from Carrboro.

Lori brought Jupiter into our room, lying on his towel in the bottom half of his cat carrier. Kelly and I sat on the floor next to him, stroking his head while we said goodbye to him. Though he was so weak he couldn’t lift his head, he was still faintly purring. This cat never, ever held any grudges.

Hallie arrived a little after 5 PM and the doctor returned to our room. With little fanfare or ceremony, we lifted Jupiter’s carrier onto the table. The doctor administered one shot as a sedative and Jupiter went still. Then she administered a second, fatal dose, which stopped his heart.

The doctor removed her stethoscope and quietly announced he was gone. I quietly burst into tears.

Kelly offered Hallie a moment alone with Jupiter and moved me out of the room. We sobbed in each other’s arms while Hallie paid her last respects. Then with tears in her eyes, Hallie went back to her homework in Carrboro and we stood quietly over the lifeless body of our porch cat.

I found Lori again and asked if we could collect some of Jupiter’s fur before he was cremated. She helped shave off a few ribbons of it and put it into a plastic bag. Then we turned him over to her and she soon brought back our empty cat carrier.

Kelly and I drove separately back home. I propped the empty carrier onto the seat next to me the way Jupiter always rode with me. Then when I got home I felt compelled to put away all the cat items. I couldn’t bear to look at them anymore.

Now there’s just emptiness where his stuff used to be. There is a hole in my heart, too, which isn’t likely to go away any time soon. This poor, scrawny critter had nothing when we met. He lived his life on the streets and had to become tough.

And yet, we tamed him. We loved him. We fed and took care of him. He had a place to sleep and yet he still had his freedom to explore. I may always wonder if I made the right call in keeping him an outdoor cat, especially after Rocket died and there wasn’t as much of a threat to his safety indoors. I thought I would be taking his freedom away from him, though perhaps giving up that freedom was what he wanted most.

The cat who always purred in the toughest situations was purring right up until the end. I hope we gave him a good life. Certainly, we rescued him from his feral ways, and along the way he captured our hearts.

It will be a while before I stop looking out for him on the porch, or expecting him to pop out of the bushes as I walk to the mailbox, or stop looking up as I drive by the house to see an orange ball of fur sitting contentedly on our front porch.

Fair winds, my little friend, Jupiter. Until we meet again someday.

Warren Myers : storage series

August 13, 2021 02:50 PM

Some of the content is mildly dated, but this series of posts from a few years ago is still something I refer to quite often:

Tarus Balog : On Leaving OpenNMS

August 09, 2021 12:30 PM

It is with mixed emotions that I am letting everyone know that I’m no longer associated with The OpenNMS Group.

Two years ago I was in a bad car accident. I suffered some major injuries which required 33 nights in the hospital, five surgeries and several months in physical therapy. What was surprising is that while I had always viewed myself as somewhat indispensable to the OpenNMS Project, it got along fine without me.

Also during this time, The OpenNMS Group was acquired. For fifteen years we had survived on the business plan of “spend less money than you earn”. While it ensured the longevity of the company and the project, it didn’t allow much room for us to pursue ideas because we had no way to fund them. We simply did not have the resources.

Since the acquisition, both the company and the project have grown substantially, and this was during a global pandemic. With OpenNMS in such a good place I began to think, for the first time in twenty years, about other options.

I started working with OpenNMS in September of 2001. I refer to my professional career before then as “Act I”, with my time at OpenNMS as “Act II”. I’m now ready to see what “Act III” has in store.

While I’m excited about the possibilities, I will miss working with the OpenNMS team. They are an amazing group of people, and it will be hard to replace the role they played in my life. I’m also eternally grateful to the OpenNMS Community, especially the guys in the Order of the Green Polo who kept the project alive when we were starting out. You are and always will be my friends.

When I was responsible for hiring at OpenNMS, I ended every offer letter with “Let’s go do great things”. I consider OpenNMS to be a “great thing” and I am eager to watch it thrive with its new investment, and I will always be proud of the small role I played in its success.

If you are doing great things and think I could contribute to your team, check out my profile on LinkedIn or Xing.

Warren Myers : determining the ‘legitimacy’/’reliability’ of a domain

August 04, 2021 03:19 PM

I’ve recently been asked by several people to investigate websites (especially e-commerce ones) for reliability/legitimacy.

Thought someone else may find my process useful, and/or have some ideas on how to improve it?

So here goes:

  1. Pop a terminal window (I’m on a Mac, so I open Terminal – feel free to use your terminal emulator of choice; on Windows, you’ll need the Windows Subsystem for Linux or Cygwin installed)
    1. Type whois <domain.tld> | less
    2. Look at all of the following:
      • Creation (Creation Date: 2006-02-22T01:12:10Z)
      • Expiration (Registry Expiry Date: 2023-02-22T01:12:10Z)
      • Name server(s) (NS3.PAIRNIC.COM)
      • Registrar URL (http://www.pairdomains.com)
      • Registrar (Pair Domains)
      • Contact info (should [generally] be anonymized in some manner)
    3. Possible flags:
      • If the domain’s under 2 years old, and/or the registration period is less than a year (we can talk about when short registrations may make sense in the comments)
      • If the name servers are “out of the country” (which, of course, will vary based on where you are)
      • If the contact info isn’t anonymized
  2. Load the website in question in a browser (use an incognito and/or proxied tab, if you like) and review the following types of pages:
    • Contact Us
      • Where are they located?
      • Does the location stated match what you expect based on the whois response?
    • About Us
      • Does it read “naturally” in the language it purports to be written in?
        • I.e., does it sound like a native speaker wrote it, or does it sound stilted or mechanically translated?
    • Does it match what is in the whois record and the Contact Us page?
    • Do they provide social media links (Twitter, Facebook, LinkedIn, Instagram, etc)?
      • What do their social media presence(s) say about them?
    • Return/Refund Policy (for ecommerce sites only)
      • What is the return window?
      • How much will be charged to send it back and/or restock it?
    • Shipping Policy (for ecommerce sites only)
      • How long from submitting an order to when it ships to when it arrives?
      • Where is it shipping from?
    • Privacy Policy (only applies if you may be sharing data with them: ecommerce, creating accounts, etc.)
      • What do they claim they will (and will not) do with your private information?
  3. Is the site running over TLS/SSL?
    • You should see a little padlock icon in your browser’s address bar
    • Click that icon, and read what the browser reports about the SSL certificate used
    • Given that running over TLS is 100% free, there is absolutely NO reason for a site to NOT use SSL (double especially if they’re purporting to be an ecommerce site)
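If you run through step 1 often, it can be scripted. Here’s a rough sketch using a canned, hypothetical whois record so it works offline – real whois output varies considerably by registrar, and the GNU `date -d` flag shown needs `gdate` (or `date -j -f`) on macOS:

```shell
#!/bin/sh
# Rough sketch of the whois checks above, run against a canned record.
# With a live domain you would pipe real output instead:
#   whois example.com | grep -Ei 'creation|expiry|name server|registrar'
sample='Creation Date: 2006-02-22T01:12:10Z
Registry Expiry Date: 2023-02-22T01:12:10Z
Name Server: NS3.PAIRNIC.COM
Registrar URL: http://www.pairdomains.com
Registrar: Pair Domains'

# Show just the fields worth eyeballing
echo "$sample" | grep -Ei 'creation|expiry|name server|registrar'

# Flag check: was the domain registered less than ~2 years ago?
# (GNU date syntax; macOS users can install coreutils and use gdate)
created=$(echo "$sample" | sed -n 's/^Creation Date: \(....-..-..\).*/\1/p')
age_days=$(( ($(date +%s) - $(date -d "$created" +%s)) / 86400 ))
if [ "$age_days" -lt 730 ]; then
  echo "FLAG: domain is only $age_days days old"
else
  echo "OK: domain is $age_days days old"
fi

# Step 3 (TLS) can also be eyeballed from the terminal (network required):
#   echo | openssl s_client -connect example.com:443 2>/dev/null \
#     | openssl x509 -noout -dates -issuer
```

None of this replaces actually reading the site, of course – it just automates the boring parts of the whois pass.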

Reviewing these items usually takes me about 2-3 minutes.

It’s not foolproof (after all, better fools are invented every day), but it can give you a good overview and relative confidence level in the site in question.

Warren Myers : 3-month review

July 13, 2021 09:31 PM

I’ve been running an M1-powered MacBook Pro since late April.

Here’s my experience so far: it Just Works™

That’s it

That’s the tweet

Want more? Sure!

Battery life is bonkers-awesome! I can run for over a full working day with my VDI client, YouTube, web browser sessions, ssh, several chat apps (including video and audio chat) sans being plugged in.

This is the laptop we’ve all wanted.

I half wish (or is it “wish”) I’d gone with the 5G-enabled, M1-powered iPad Pro instead … but this laptop is phenomenal.

There has been nothing I’ve wanted to do in the last 3 months that I could not do with this machine.

Kudos, Apple.

Kudos.

Tarus Balog : Order of the Green Polo: Requiescat In Pace

June 16, 2021 04:02 PM

One of the first “group chat” technologies I was ever exposed to was Internet Relay Chat (IRC). This allowed a group of people to get together in areas called “channels” to discuss pretty much anything they felt like discussing. The service had to be hosted somewhere, and for most open source projects that was Freenode.

You might have seen that recently Freenode was taken over by new management, and the policies this new management implemented didn’t sit well with most Freenode users. In the grand open source tradition, most everyone left and went to other IRC servers, most notably Libera Chat.

In May of 2002 when I became the sole maintainer of OpenNMS, there was exactly one person who was dedicated full time to the project – me. What kept me going was the community I found on IRC, in both the #opennms channel and the local Linux users group channel, #trilug.

It was the people on IRC who supported me until I could grow the business to the point of bringing on more people. I still have strong friendships with many of them.

I was reminded of those early days as we migrated #opennms to Libera Chat. At the moment there are only 12 members logged in, and most of those are olde skoool OpenNMS people. I haven’t used IRC much since we switched to Mattermost (we host a server at chat.opennms.com) and with it a “bridge” to bring IRC conversations into the main Mattermost channel. Most people moved to use Mattermost as their primary client, but of course there were a few holdouts (Hi Alex!).

While I was reminiscing, I was also reminded of the Order of the Green Polo (OGP). When David, Matt and I started The OpenNMS Group in 2004, interest in OpenNMS was growing, and there was a core of those folks on IRC who were very active in contributing to the project. I was trying to think of some way to recognize them.

At that time, business casual, at least for men, consisted of a polo shirt and khaki slacks. Vendors often gifted polo shirts with their logos/logotypes on them to clients, and a number of open source projects sold them to raise money. We sold a white one and a black one, and I thought, hey, perhaps I can pick another color and use that to identify the special contributors to OpenNMS.

Green has always been associated with OpenNMS. In network monitoring, green symbolizes that everything is awesome. We even named one of our professional services products the “Greenlight Project“. Plus I really like green as a color.

Then the question became “what shade of green?” For some reason I thought of Tiger Woods who, by this time, late 2004, had won the prestigious Masters golf tournament three times (and would again the next spring). The winner of that tournament gets a “hunter green” jacket, and so I decided that hunter green would be the color.

Also, for some unknown reason, I saw an article about a British knighthood called “The Order of the Garter“. I combined the two and thus “The Order of the Green Polo” was born.

It was awesome.

People who had been active in contributing to OpenNMS became even more active when I recognized them with the OGP honor. They contributed code and helped us with supporting our community, as well as adding a lot to the direction of the project. We started having annual developer conferences called “Dev-Jam” and OGP members got to attend for free so we could spend some face to face time with each other. I considered these men in the OGP to be my brothers.

As OpenNMS grew, we looked to the OGP for recruitment. It was through the OGP that Alejandro came to the US from Venezuela and now leads our support and services team (if OpenNMS went away tomorrow, getting him and his spouse here would have made it all worth it). When you hired an OGP member, you were basically paying them to do something they wanted to do for free. Think of it as eating an ice cream sundae and finding money at the bottom.

But that growth was actually something that led to the decline of the OGP. When we hired everyone that wanted a job with us, the role of the OGP declined. Dev-Jam was open to anyone, but it was mandatory for OpenNMS employees. Not all employees were OGP even though they were full-time contributors, so there was often pressure to induct new employees into the Order. And, most importantly, as we aged many OGP members moved on to other things. Hey, it happens, and it doesn’t reflect poorly on their past contributions.

We had a special mailing list for the OGP, but instead of discussing OpenNMS governance it basically became a “happy birthday” list (speaking of which, Happy Birthday Antonio!). When OpenNMS was acquired by NantHealth, we had to merge our mail systems and in the process the OGP list was deactivated. I don’t think many people noticed.

Recently it was brought to my attention that associating OpenNMS with the Masters golf tournament through the OGP could have negative connotations. The Masters is hosted by the Augusta National Golf Club and there have been controversies around their membership policies and views on race. It was suggested that we rename the OGP to something else.

One quick solution would be to just change the shade of green to, perhaps, a “stoplight” green. But this got me to thinking that the same logic used to associate the color with racism could apply to the whole “Order of” as well, since that was based on a British knighthood which, much like Augusta, is mainly all male. Plus the British don’t have the best track record when it comes to colonialism, etc.

I think it is time for something totally new, so I’ve decided to retire the Order of the Green Polo. The members of the OGP are all male, and I’m extremely excited that as we’ve grown our company and project we have been able to greatly improve our diversity, and I would love to come up with something that can embrace everyone who has a love of OpenNMS and wants to contribute to it, be that through code, documentation, the community, &tc.

OpenNMS has changed greatly over the past two decades, and it has become harder to contribute to a project that has grown exponentially in complexity. As part of my role as the Chief Evangelist of OpenNMS, I want to change that and come up with easier ways for people to improve the OpenNMS platform, and I need to come up with a new program to recognize those who contribute (and if you want to skip that part and get right to the job thingie, we’re hiring, but don’t skip that part).

To those of you who were in the Order of the Green Polo, thank you so much for helping us make OpenNMS what it is today. I’m not sure if it would exist without you. And even without the OGP mailing list, I plan to remember your birthdays.

Mark Turner : One more thing I learned is the value of my blog

June 11, 2021 08:30 PM

One important takeaway from this week’s social media dust-up is the value of having my blog. I liked to pretend that Twitter was more open than Facebook and thus I favored posting there. Yet, when someone falsely accused me there, blocked me, and went on to spread this lie to all of her followers, Twitter left me few, if any, options for getting my response out. It was maddening to watch the rumors spread and have no way of countering them with the truth.

Here, I own my own bits. Here, I decide what gets said. Here, I may solicit discussion or … not. Here, my words live forever.

All that, and I have a goddamn edit button, too.

Mark Turner : On the Internet nobody knows you’re a spook

June 11, 2021 01:12 PM

OZ Division, USS ELLIOT DD-967, fall 1991.

Had a dust-up on social media the other day and, frankly, I am still mystified how it all took place.

I tend to follow online and amplify veterans who lean left because the perception of the military consisting of only right-wingers needs to change. A tweet from one of the more popular veterans I follow attracted several good comments. I liked one from a particular veteran (we’ll call her Karen), checked her profile, and followed her when I saw we had something in common: our Navy occupations were in cryptography.

A few days later, she followed me back. I decided to say hi to her in a direct message:

Well, because my brain is perpetually fried lately I goofed in how I specified my rank. I was a CTR2 rather than a CTR5, which – admittedly – doesn’t exist. A second class petty officer is an E-5. I got my numbers mixed up. It doesn’t help that I was juggling three other tasks for my day job and trying to get them done so I could grab lunch. It was an innocent typo. As you can see here, though, I owned up to my mistake and attributed it to my brain not working right anymore from Desert Storm.

I turned my attention away from Twitter and continued working, thinking that was the end of it. A few hours later, I opened Facebook to find that Karen had posted in a group of which we are both members, the Naval Security Group Activity (NSGA) group. Karen made fun of the guy on Twitter who couldn’t get his rate/rank straight, and you know, that was fair game. I responded in a comment that the person she was laughing at was me and that, yes, I had told her immediately that my brain doesn’t work right anymore. Two of my division shipmates weighed in to say I was indeed who I say I am and that her bullying needs to stop.

Now at this point, a functioning adult would have realized her mistake, admitted it, and perhaps offered an apology for not believing me. Karen chose to double-down instead, quizzing me about my health issues which I have shared some of here before but don’t feel like dragging out into the public all the time. Fine, I thought. I extended an olive branch to her in her Facebook post and was willing to move on.

A bit later, I happened to check Karen’s Twitter feed again. Karen had thoughtfully updated her Twitter friends with info from the NSGA thread, saying now she felt like an ass since I had included a photo of me with my division. Still, this apparently wasn’t enough proof to get her to move on, much less apologize.

In the NSGA post, she had told me she didn’t name me in her original Twitter post. That is true enough – it wasn’t in her initial post – but it wasn’t long before she gleefully shared it with her followers, causing some of them to block me as well. Discovering this made me very angry. It’s one thing to anonymously laugh at my innocent mistake but to name me, even after being proven wrong?

Karen was proven wrong in a public forum and was chided for bullying me. Again, she could have been a functioning adult but instead dug in. Suddenly, her accusation changed from me claiming stolen valor to me faking my illness! On Twitter, she referenced the USS ELLIOT Wikipedia page and insinuated that because ELLIOT wasn’t present when missiles were flying, obviously I was making up my health issues.

First off, anyone who cites Wikipedia on anything is an idiot. Anyone can edit pages and rarely is anything there checked or edited. Secondly, I wrote about half of the USS ELLIOT Wikipedia page, including nearly all the details of the Westpac 90 and 91 deployments I was on. Thirdly, as the screenshot of our initial chat shows, I never told her I was in Desert Storm – the combat phase with fighting and all that. I simply said my health has not been the same since Desert Storm. My ship deployed to the Persian Gulf during the post-combat phase of Desert Storm, i.e. after the cease-fire. It was still a combat zone, full of stuff that could kill me, and I received combat pay for the three largely-boring months we did circles in the tiny Gulf.

Now, a layperson might think “well, how could he be injured then?” I admit that for the longest time, I doubted myself that I had Gulf War Illness. The fact is, though, my ship entered the Gulf at the height of the Gulf War’s environmental disaster. Thousands of well heads had been set on fire by retreating Iraqi troops, filling the skies with black oil smoke. This smoke followed the prevailing winds and blanketed the Gulf, turning noon into twilight. I have photos and video of us escorting a tanker north to a Kuwaiti terminal and the sky is filled with smoke. In hindsight, I wish I had stayed below decks as I was on the midwatch at the time and only woke up because our captain decided to play tour guide with the 1MC and woke everybody up. I have since wondered if I had stayed inside during that time, maybe my health would be better now.

The oil smoke particles weren’t the only particulates I was breathing in. Extra-fine sand gets carried by the desert wind and can cause silicosis. When topside I would occasionally wear a bandanna over my face, but not all the time.

Then there was the nerve gas. I had known that chemical weapons alarms had gone off during the war. I read something afterward that speculated that some of this gas had been released at the wellheads along with the smoke. Now, I don’t know if that is true – and I have strong doubts about it – but I have read that even exposure to tiny amounts of nerve gas is enough to cause permanent damage. I can’t rule it out.

Next, there were the numerous vaccinations we were all required to get. Malaria shots were required, of course, but also an experimental Anthrax vaccine. Some were issued PB pills but I don’t believe I had to take those. At any rate, I was stupid and misplaced my shot card so I am not sure what got put into my body. My service records don’t show much, either.

Finally, there is depleted uranium. Depleted Uranium (DU) is an exceptionally-hard metal which is used as an anti-armor projectile. I was not a gunner’s mate but I was around depleted uranium a few times. As a ship’s photographer, I snapped photos of the DU rounds that were used in our Close-In Weapon System (CIWS), an R2D2-looking system designed to down incoming missiles. Not only that, I have photographed the CIWS in action as it chewed through hundreds of DU rounds during its tests. I absolutely have breathed in dust containing depleted uranium.

All of this to say, while no one may have been actively trying to kill us (save for a floating Iraqi mine), I have plenty of reason to suspect my current health issues are Gulf War-related.

But I shouldn’t have to go into all of that, right? Karen should have been able to take my word on it. I said right from the start that I have health issues. And, hell, my blog URL is part of my Twitter profile. She could’ve searched here for the numerous posts I’ve written about my health, my service, and the Gulf War.

At this point I was incensed.

Tried to say I hadn’t served? Proven wrong. Tried to say I wasn’t a CT? Proven wrong. Tried to say I was using my illness as an excuse? A pathetic attempt at saving face, and while I mentioned my health issues in her Facebook post I can’t just post my medical records online. Furthermore, it’s just batshit insane that, once her initial premise had been proven wrong, she turned to attacking my health.

I weighed in on one of Karen’s Twitter threads about USS ELLIOT. Was I condescending? Fuck yes, I was! I was pissed. She had posted fighting words and continued to post them but left me no way to respond. In hindsight this wasn’t my best move because Karen had been the bully throughout this episode but now I had given her the cover to claim victimhood.

My posting ticked off some of her followers who then unloaded on me. I traded posts with one of them, mentioning the many shipmates I served with who met untimely deaths. He quoted to me the text I had written myself on the USS ELLIOT Wikipedia page and that backfired on him.

Rather than continue to talk past each other, I offered to have a video chat with him. I had planned to set up a Zoom call where I would answer any questions he had. He was not brave enough to take me up on the offer, though, so as far as I am concerned he and Karen’s friends aren’t really interested in the truth. I am done engaging them now.

So, what has this episode taught me?

1. Always stand up for your good name, no matter what the cost.
2. I am at my core always willing to poke the bear, even if poking the bear is not in my best interests.
3. There are a surprising number of adults who have never learned the value of admitting when they’re wrong. There are people who won’t change their minds even when presented with a mountain of evidence.
4. Not all sailors are shipmates, even ones who did the same things you did.
5. It’s okay to piss people off, especially if they have it coming. I don’t need everyone to be my friend. This is a hard one for me to accept but it’s true. As someone said, “you are the villain in someone else’s story.” So be it.
6. Value the people who do have your back. I am grateful for the friends who stood with me.

I requested that the NSGA group post be removed and it was, thankfully. The Twitter stigma remains, however. Social media is great until it isn’t, I guess, and expecting to have a worthwhile conversation with total strangers on the Internet is an impossibly high bar.

Now, on to other things!

Update 16:02: Hi folks! If you want to learn more about my super-fun experiences with GWI, check these posts here and also here and here, especially. Keep in mind that these are only a very small sample of the episodes that I’ve had. You might uncover some I’ve missed here with a search of my blog’s Health category.

Tarus Balog : What’s Old Is New Again

May 12, 2021 03:46 PM

Today we launched a new look for OpenNMS, a rebranding effort that has been going on for the better part of a year. It represents a lot more than just a new logo and new colors. While OpenNMS has been around for over two decades now, it is also quite different from when it started. A tremendous amount of work has gone into the project over the past couple of years, and if you last looked at it even a short while ago you will be surprised at what has changed.

New OpenNMS Logo

One of the best analogies I can come up with to talk about the “new” OpenNMS concerns cars. I like cars, especially Mercedes, and when I was in college I usually drove an older Mercedes sedan. I enjoyed bringing them back to their former glory (and old, somewhat beaten down cars were all I could afford), and so I might start by redoing the brake system, overhauling the engine, etc.

When I would run out of money, which was often, sometimes I’d have to sell a car. Prospective buyers would often complain that the paint wasn’t perfect or there was an issue with the interior. I’d point out that you could hop in this car right now and drive it across the country and never worry about breaking down, but they seemed focused on how it looked. Cosmetics are usually the last thing you focus on during a restoration, but it tends to be the first thing people see.

This is very much like OpenNMS. For over a decade we’ve been focused on the internals of the platform, and luckily we are now in a position to focus on how it looks.

Please don’t misunderstand: application usability is important, much more important than, say, the paint job on a car, but in order to provide the best user experience we had to start by working under the hood.

For example, from the beginning OpenNMS has contained multiple “daemons” that control various aspects of the platform. Originally this was very monolithic, and thus any small change to one of them would often require restarting the whole application.

OpenNMS is now based on a Karaf runtime, which provides a modular way of managing the various features within the application. It comes with a shell that gives even non-Java programmers access to both high and low level parts of the platform, and lets them make changes without restarting the whole thing. Features can be enabled and disabled on the fly, and it is easy to test the behavior of OpenNMS against a particular device without having to set up a special test environment or pore through pages of logs.

Another great aspect of OpenNMS is that much of the internal messaging can now take place through a broker such as Kafka. Not only does this increase the stability and flexibility of the platform, it also lets users create custom consumers for the huge amounts of information OpenNMS is able to collect. For very large networks this creates the option to use that data outside of the platform itself, giving end users a high level of custom observability.

The monolithic nature of OpenNMS has also been addressed. The addition of “Minions” to provide monitoring at the edge of the network creates numerous monitoring options where there were none before. You can now reach into isolated or private networks, or monitor the performance of applications from various locations, seamlessly. The “Sentinel” project allows the various processes within OpenNMS to be spread out over multiple devices, with the aim of virtually unlimited scale.

APM Example World Map

And I haven’t even started on the ability of OpenNMS to monitor tremendous amounts of telemetry data and to analyze it with tools such as “Nephron” or our foray into artificial intelligence with ALEC.

So much has changed with OpenNMS, much of it recently, that it was time for that new coat of paint. It was time for people to both notice the new look of OpenNMS at the surface, and the new OpenNMS under the covers.

One thing that hasn’t changed is that OpenNMS is still 100% open source. All of these amazing features are available to anyone under an OSI approved open source license. Plus we leverage and integrate with best-in-class open source tools such as Grafana for visualization and Cassandra (using Newts) for storing time series data.

Our new logo is a stylized gyroscope. For centuries the gyroscope has represented a way to maintain orientation in the most chaotic of situations. In much the same way, OpenNMS helps you maintain the orientation of your IT infrastructure which, let’s admit it, plays a huge role in the success of your enterprise.

Where the car analogy falls apart is that while the paint job is usually the end of a restoration, this new look for OpenNMS is just the beginning of a new chapter in the history of the project. Our goal is to create a platform where monitoring just happens. We’re not there yet, but check out the latest OpenNMS and we hope you’ll agree we are getting closer.

Warren Myers : sometimes i’m off

April 27, 2021 09:23 PM

It took Apple 5.5 (or 6, if you count last week as really hitting it) years to introduce what I called the MacBook Flex back in 2015.

With the 13″ MacBook Pro available in an M1-powered edition (which is so much better than the top-end MBP from 2019…it’s not even funny), and now a 5G-enabled iPad Pro running on the M1 … it’s here.

Mark Turner : Practicing my OSINT skills

April 18, 2021 02:23 AM

Yesterday, a story went viral of a North Carolina man and woman who fought off an attack by a rabid bobcat. This story made news all over the world (it was a slow news Friday, I suppose) but I became annoyed that none of the stories mentioned who the victims were. I thought this might make a good opportunity to use my Open Source Intelligence (OSINT) skills to try to identify them based on what was known so far. And what do you know, I managed to do it!

Since I hadn’t seen that the couple had granted any interviews anywhere, I figured they were not interested in publicity and I opted not to mention their names publicly. I now see that Wilmington station WECT has interviewed them so I can reveal my work. The folks involved are good people and I don’t want my post to be used to harass them so I will focus on my techniques rather than their identity.

So, at the start of this journey all I had was the video. You see them leaving their house in the morning and getting attacked by the bobcat as they attempt to get into their car. The man pulls the animal off of his wife and flings it into the yard before they escape. It’s quite wild.


We can learn a lot from studying this video.

1. We know it’s morning. On the audio, we hear the man greet the passing jogger with “Good morning.” He puts his coffee on his car.
2. Sunlight is visible on the homes in the background. Since it’s morning, we know we are facing west.
3. Key identifying items are visible. The home is near a curve. The sidewalk ends on the right side of their yard. There is a storm catch basin directly across from the driveway.

All these things help separate this home from others in the neighborhood.

Listening carefully to the clip we hear the wife frantically calling her husband’s name, Happy.

An additional bit of information came to me. Someone mentioned it happened in North Carolina; Pender County to be exact. Another person mentioned a bulletin recently put out by the Pender County Sheriff’s Department. It was a notice warning that a rabid bobcat had attacked people and to make sure all of your pets are vaccinated. It said the attack occurred on April 9th, the dead bobcat was tested at N.C. State, and the bobcat was found to have had rabies. It also mentioned that this had happened in the Creekside Subdivision of Burgaw, NC.

So, bobcats are normally extremely shy. They are nocturnal and avoid people at all costs. Interactions with people are thus very rare. It looked like the Pender County Sheriff’s bulletin must have been referring to the attack in the video.

So we know a few things so far:
1. Distinctive features of Happy and his wife’s home.
2. That their home faces west.
3. That the home is likely in the Creekside Subdivision of Pender County.
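The elimination logic above is simple enough to sketch in code. This is purely illustrative — the lot data below is invented, and in practice the matching was done by eye against the maps:

```python
# Toy sketch of the house-elimination process: describe each candidate lot
# by the features observed in the video, then keep only the lots that match
# every observation. All of the data here is hypothetical.

candidates = [
    {"lot": 12, "faces_west": False, "near_curve": True,  "sidewalk_ends": False},
    {"lot": 27, "faces_west": True,  "near_curve": True,  "sidewalk_ends": True},
    {"lot": 31, "faces_west": True,  "near_curve": False, "sidewalk_ends": True},
]

criteria = ("faces_west", "near_curve", "sidewalk_ends")

matches = [c["lot"] for c in candidates if all(c[k] for k in criteria)]
print(matches)  # only lot 27 satisfies every observation in this toy data
```

Each new observation is another filter, which is why even a handful of mundane details (a curve, a catch basin, a sidewalk) narrows things down so quickly.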

Time to go to Google Maps and see what we can find.

I put “Creekside Burgaw, NC” into the search bar and I’m presented with a promising result:

This is a nice, compact subdivision so it shouldn’t take long to narrow down the house. Unfortunately Google Street View has not yet snapped on-the-ground photographs so we’ll have to figure this out ourselves. The subdivision only has one turn on the west side, so immediately I’m looking in the lower right corner for our house.

There is only one house shown on the street map that meets our criteria. But the video shows houses to the left of our house, among others. Where are those houses? It appears Google is not up to date on the construction here.

Let’s check the satellite view, as sometimes these two can differ:

No luck. We see an image of the home under construction that’s shown on the street map but none of the surrounding houses. Time to look for a more up-to-date map of the area.

Let’s check the Pender County GIS site. Local governments tend to have GIS systems to maintain more accurate maps for their tax assessments, first responders, and the like. Pender County is no exception and has a very easy to use GIS system. Let’s see what the subdivision looks like in county records:

Bingo. We now have a map showing all the lots in the Creekside subdivision. There’s a home on the western side and in the correct spot in relation to the curve. It also abuts a pond and thus has the sidewalk ending at the property line.

Perfect. This is our house.

GIS tells us the owner, too. In case you were wondering, Happy is only a nickname; his formal name is Leon H. Wade III. It is common for family members who share names to become known primarily by nicknames. To be sure this is the person we want, a search on his nickname and last name (which is uncommon, and thus gives higher confidence in a match) turns up a LinkedIn page for a gentleman who works in Wilmington, NC, the largest city near Pender County.

We can now say with some confidence that we found the person we were looking for. These same techniques can be used to help locate other photographs (for instance, photos of an adversary) and identify the persons in those photographs. It’s also fun.

Warren Myers : think-read-speak

March 24, 2021 09:12 PM

deeply-broadly-carefully

think-read-speak deeply-broadly-carefully

Please feel free to use/share/copy/adapt this image

Tarus Balog : OpenNMS Resources

February 25, 2021 07:22 PM

Getting started with OpenNMS can be a little daunting, so I thought I’d group together some of the best places to start.

When OpenNMS began 20+ years ago, the main communication channel was a group of mailing lists. For real time interaction we added an “#opennms” IRC channel on Freenode as well. As new technology came along we eagerly adopted it: hosting forums, creating a FAQ with FAQ-o-matic, building a wiki, writing blogs, etc.

The problem became that we had too many resources. Many weren’t updated and thus might host obsolete information, and it was hard for new users to find what they wanted. So a couple of years ago we decided to focus on just two main places for community information.

We adopted Discourse to serve as our “asynchronous” communication platform. Hosted at opennms.discourse.group, the goal is to migrate all of the information that used to reside on sites like FAQs and wikis to one place. Inasmuch as our community has a group memory, this is it, and we try to keep the information on this site as up to date as possible. While there is still some information left in places like our wiki, the goal is to move it all to Discourse, and thus it is a great place to start.

I also want to call your attention to “OpenNMS on the Horizon (OOH)”. This is a weekly update of everything OpenNMS, and it is a good way to keep up with all the work going on with the platform since a lot of the changes being made aren’t immediately obvious.

While we’ve been happy with Discourse, sometimes you just want to interact with someone in real time. For that we created chat.opennms.com. This is an instance of Mattermost that we host to provide a Slack-like experience for our community. It basically replaces the IRC channel, but there is also a bridge between IRC and MM so that posts are shared between the two. I am “sortova” on Mattermost.

When you create an account on our Mattermost instance you will be added to a channel called “Town Square”. Every Mattermost instance has to have a default channel, and this is ours. Note that we use Town Square as a social channel. People post things that may be of interest to the wider OpenNMS community, usually something humorous. As I write this there are over 1300 people who have signed up on Town Square.

For OpenNMS questions you will want to join the channel “OpenNMS Discussion”. This is the main place to interact with our community, and as long as you ask smart questions you are likely to get help with any OpenNMS issues you are facing. The second most popular channel is “OpenNMS Development” for those interested in working with the code directly. The Minion and Compass applications also have their own channels.

Another channel is “Write the Docs”. Many years ago we decided to make documentation a key part of OpenNMS development. While I have never read any software documentation that couldn’t be improved, I am pretty proud of the work the documentation team has put into ours. Which brings me to yet another source of OpenNMS information: the official documentation.

Hosted at docs.opennms.org, our documentation is managed just like our application code. It is written in AsciiDoc and published using Antora. The documentation is versioned just like our Horizon releases, but usually whenever I need to look something up I go directly to the development branch. The admin guide tends to have the most useful information, but there are guides for other aspects of OpenNMS as well.

The one downside of our docs is that they tend to be more reference guides than “how-to” articles. I am hoping to correct that in the future but in the meantime I did create a series of “OpenNMS 101” videos on YouTube.

They mirror some of our in-person training classes, and while they are getting out of date I plan to update them real soon (we are in the process of getting ready for a new release with lots of changes so I don’t want to do them and have to re-do them soon after). Unfortunately YouTube doesn’t allow you to version videos so I’m going to have to figure out how to name them.

Speaking of changes, we document almost everything that changes in OpenNMS in our Jira instance at issues.opennms.org. Every code change that gets submitted should have a corresponding Jira issue, and it is also a place where our users can open bug reports and feature requests. As you might expect, if you need to open a bug report please be as detailed as possible. The first thing we will try to do is recreate the problem, so information such as the version of OpenNMS you are running, the operating system you are using, and the steps to reproduce the issue is most welcome.

If you would like us to add a feature, you can add a Feature Request, and if you want us to improve an existing feature you can add an Enhancement Request. Note that I think you have to have an account to access some of the public issues on the system. We are working to remove that requirement as we wish to be as transparent as possible, but I don’t think we’ve been able to get it to work just yet. I just attempted to visit a random issue and it did load but it was missing a lot of information that shows up when I go to that link while authenticated, such as the left menu and the Git Integration. You will need an account to open or comment on issues. There is no charge to open an account, of course.

Speaking of git, there is one last resource I need to bring up: the code. We host our code on Github, and we’ve separated out many of our projects to make them easier to manage. The main OpenNMS application is under “opennms” (naturally) but other projects, such as our machine learning feature ALEC, have their own repositories.

While it was not my intent to delve into all things git in this post, I did want to point out that in the top level directory of the “opennms” project we have two scripts, makerpm.sh and makedeb.sh, that you can use to easily build your own OpenNMS packages. I have a video queued up to go over this in detail, but to build RPMs all you’ll need is a base CentOS/RHEL install and the packages “git” (of course), “expect”, “rpm-build” and “rsync”. You’ll also need a Java 8 JDK. While we run on Java 11, at the moment we don’t build using it (if you check out the latest OOH you’ll see we are working on it). Then you can run makerpm.sh and watch the magic happen. Note the first build takes a long time because you have to download all of the Maven dependencies, but subsequent builds should be faster.

To summarize:

For normal community interaction, start with Discourse and use Mattermost for real time interaction.

For reference, check out our documentation and our YouTube channel.

For code issues, look toward our Jira instance and our Github repository.

OpenNMS is a powerful monitoring platform with a steep learning curve, but we are here to help. Our community is pretty welcoming, and we hope to see you there soon.

Tarus Balog : Open Source Contributor Agreements

February 24, 2021 04:41 PM

I noticed a recent uptick in activity on Twitter about open source Contributor License Agreements (CLAs), mostly negative.

Twitter Post About CLAs

The above comment is from a friend of mine who has been involved in open source longer than I have, and whose opinions I respect. On this issue, however, I have to disagree.

This is definitely not the first time CLAs have been in the news. The first time I remember even hearing about them concerned MySQL. The MySQL CLA required a contributor to sign over ownership of any contribution to the project, which many thought was fine when the company was independent, but which started to raise some concerns when it was acquired by Sun and then Oracle. I think this latest resurgence is the result of Elastic deciding to change their license from an open source one to something more “open source adjacent”, which has caused a number of people to take exception (note: link contains strong language).

As someone who doesn’t write much code, I think deciding to sign a CLA is up to the individual and may change from project to project. What I wanted to share is a story of why we at OpenNMS have a CLA and how we decided on one to adopt, in the hopes of explaining why a CLA can be a positive thing. I don’t think it will help with the frustrations some feel when a project changes the license out from under them, but I’m hoping it will shed some light on our reasons and thought processes.

OpenNMS was started in 1999 and I didn’t get involved until 2001 when I started work at Oculan, the commercial company behind the project. Oculan built a monitoring appliance based on OpenNMS, so while OpenNMS was offered under the GPLv2, the rest of their product had a proprietary license. They were able to do this because they owned 100% of the copyright to OpenNMS. In 2002 Oculan decided to no longer work on the project, and I was able to become the maintainer. Note that this didn’t mean that I “owned” the OpenNMS copyright. Oculan still owned the copyright but due to the terms of the license I (as well as anyone else) was free to make derivative works as long as those works adhered to the license. While the project owned the copyright to all the changes made since I took it over, there was no one copyright holder for the project as a whole.

This is fine, right? It’s open source and so everything is awesome.

Fast forward several years and we became aware of a company, funded by VCs out of Silicon Valley, that was using OpenNMS in violation of the license as a base on which to build a proprietary software application.

I can’t really express how powerless we felt about this. At the time there were, I think, five people working full time on OpenNMS. The other company had millions in VC money while we were adhering to our business model of “spend less than you earn”. We had almost no money for lawyers, and without the involvement of lawyers this wasn’t going to get resolved. One thing you learn is that while those of us in the open source world care a lot about licenses, the world at large does not. And since OpenNMS was backed by a for-profit company, there was no one to help us but ourselves (there are some limited options for license enforcement available to non-profit organizations).

We did decide to retain the services of a law firm, who immediately warned us how much “discovery” could cost. Discovery is the process of obtaining evidence in a possible lawsuit. This is one way a larger firm can fend off the legal challenges of a smaller firm – simply outspend them. It made us pretty anxious.

Once our law firm contacted the other company, the reply was that if they were using OpenNMS code, they were only using the Oculan code and thus we had no standing to bring a copyright lawsuit against them.

Now we knew this wasn’t true, because the main reason we knew this company was using OpenNMS was that a disgruntled previous employee told us about it. They alleged that this company had told their engineers to follow OpenNMS commits and integrate our changes into their product. But since much of the code was still part of the original Oculan code base, it made our job much more difficult.

One option we had was to get with Oculan and jointly pursue a remedy against this company. The problem was that Oculan went out of business in 2004, and it took us a while to find out that the intellectual property had ended up at Raritan. We were able to work with Raritan once we found this out, but by that time the other company had also gone out of business, pretty much ending the matter.

As part of our deal with Raritan, OpenNMS was able to purchase the copyright to the OpenNMS code once owned by Oculan, granting Raritan an unlimited license to continue to use the parts of the code they had in their products. It wasn’t cheap and involved both myself and my business partner using the equity in our homes to guarantee a loan to cover the purchase, but for the first time in years most of the OpenNMS copyright was held by one organization.

This process made us think long and hard about managing copyright moving forward. While we didn’t have thousands of contributors like some projects, the number of contributors we did have was non-trivial, and we had no CLA in place. The main question was: if we were going to adopt a CLA, what should it look like? I didn’t like the idea of asking for complete ownership of contributions, as OpenNMS is a platform and while someone might want to contribute, say, a monitor to OpenNMS, they shouldn’t be prevented from contributing a similar monitor to Icinga or Zabbix.

So we asked our community, and a person named DJ Gregor suggested we adopt the Sun (now Oracle) Contributor Agreement. This agreement introduced the idea of “dual copyright”. Basically, the contributor keeps ownership of their work but grants copyright to the project as well. This was a pretty new idea at the time but seems to be common now. If you look at CLAs for, say, Microsoft and even Elastic, you’ll see similar language, although it is more likely worded as a “copyright grant” or something other than “dual copyright”.

This idea was favorable to our community, so we adopted it as the “OpenNMS Contributor Agreement” (OCA). Now the hard work began. While most of our active contributors were able to sign the OCA, what about the inactive ones? With a project as old as OpenNMS there are a number of people who had been involved in the project but due to either other interests or changing priorities they were no longer active. I remember going through all the contributions in our code base and systematically hunting down every contributor, no matter how small, and asking them to sign the OCA. They all did, which was nice, but it wasn’t an easy task. I can remember the e-mail of one contributor bounced and I finally hunted them down in Ireland via LinkedIn.

Now a lot of the focus of CLAs is around code ownership, but there is a second, often more important part. Most CLAs ask the contributor to affirm that they actually own the changes they are contributing. This may seem trivial, but I think it is important. Sure, a contributor can lie and if it turns out they contributed something they really didn’t own the project is still responsible for dealing with that code, but there are a number of studies that have shown that simply reminding someone about a moral obligation goes a long way to reinforce ethical behavior. When someone decides to sign a CLA with such a clause it will at least make them think about it and reaffirm that their work is their own. If a project doesn’t want to ask for a copyright assignment or grant, they should at least ask for something like this.

While the initial process was pretty manual, managing the OCAs is now largely automated. When someone makes a pull request on our Github project, it will check to see if they have signed the OCA and, if not, send them to the agreement.
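A minimal sketch of such a check might look like this (the signer list and URL here are hypothetical placeholders, not our actual tooling):

```python
# Hypothetical sketch of an OCA gate on a pull request: look the author up
# in the set of contributors who have signed, and either pass the check or
# point them at the agreement. Not the real OpenNMS integration.

SIGNED_OCA = {"dj-gregor", "sortova"}       # hypothetical signer records
AGREEMENT_URL = "https://example.com/oca"   # hypothetical agreement page

def check_pull_request(author):
    """Return (passed, message) for a CLA status check on a pull request."""
    if author in SIGNED_OCA:
        return True, "%s has signed the OCA" % author
    return False, "Please sign the OCA first: %s" % AGREEMENT_URL

ok, msg = check_pull_request("new-contributor")
print(ok, msg)
```

The point is that the check runs on every pull request, so no contribution can land without a signed agreement on file.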

The fact that the copyright was under one organization came in handy when we changed the license. One of my favorite business models for open source software is paid hosting, and I often refer to WordPress as an example. WordPress is dead simple to install, but it does require that you have your own server, understand setting up a database, etc. If you don’t want to do that, you can pay WordPress a fee and they’ll host the product for you. It’s a way to stay pure open source yet generate revenue.

But what happens if you work on an open source project and a much bigger, much better funded company just takes your project and hosts it? I believe one of the issues facing Elastic was that Amazon was monetizing their work and they didn’t like it. Open source software is governed mainly by copyright law and if you don’t distribute a “copy” then copyright doesn’t apply. Many lawyers would claim that if I give you access to open source software via a website or an API then I’m not giving you a copy.

We dealt with this at OpenNMS, and as usual we asked our community for advice. Once again I think it was DJ who suggested we change our license to the Affero GPL (AGPLv3) which specifically extends the requirement to offer access to the code even if you only offer it as a hosted service. We were able to make this change easily because the copyright was held by one entity. Can you imagine if we had to track down every contributor over 15+ years? What if a contributor dies? Does a project have to deal with their estate or do they have to remove the contribution? It’s not easy. If there is no copyright assignment, a CLA should at least include detailed contact information in case the contributor needs to be reached in the future.

Finally, remember that open source is open source. Don’t like the AGPLv3? Well you are free to fork the last OpenNMS GPLv2 release and improve it from there. Don’t like what Elastic did with their license? Feel free to fork it.

You might have detected a theme here. We relied heavily on our community in making these decisions. The OpenNMS Group, as stewards of the OpenNMS Project, takes seriously the responsibilities to preserve the open source nature of OpenNMS, and I like to think that has earned us some trust. Having a CLA in place addresses some real business needs, and while I can understand people feeling betrayed at the actions of some companies, ultimately the choice is yours as to whether or not the benefits of being involved in a particular project outweigh the requirement to sign a contributor agreement.

Tarus Balog : The Server Room Show Podcast

February 23, 2021 04:05 PM

A couple of weeks ago I had the pleasure to chat with Viktor Madarasz on “The Server Room Show” podcast.

The Server Room Podcast Graphic

Viktor is an IT professional with a strong interest in open source, and we had a fun and meandering conversation covering a number of topics. As usual, I talked too much, so he ended up splitting our conversation across two episodes.

You can visit his website for links to the podcast from a large variety of podcast sources, or you can listen on Youtube to part one and part two.

It was fun, and I hope to be able to chat again sometime in the future.

Note: Viktor is originally from Hungary, as was my grandfather. I tried to make getting some Túró Rudi part of my appearing on the show, but unfortunately we haven’t figured out how to get it outside of Hungary, and we all know that I’d talk about open source for free pretty much any time and any place.

Tarus Balog : Thoughts on Security and Open Source Software

February 22, 2021 02:15 PM

Due to the recent supply-chain attack on Solarwinds products, I wanted to put down a few thoughts on the role of open source software and security. It is kind of a rambling post and I’ll probably lose all three of my readers by the end, but I found it interesting to think about how we got here in the first place.

I got my first computer, a TRS-80, as a Christmas present in 1978 from my parents.

Tarus and his TRS-80

As far as I know, these are the only known pictures of it, lifted from my high school yearbook.

Now, I know what you are thinking: Dude, looking that good how did you find the time off your social calendar to play with computers? Listen, if you love something, you make the time.

(grin)

Unlike today, I pretty much knew about all of the software that ran on that system. This was before “open source” (and before a lot of things) but since the most common programming language was BASIC, the main way to get software was to type in the program listing from a magazine or book. Thus it was “source available” at least, and that’s how I learned to type as well as being introduced to the “syntax error”. That cassette deck in the picture was the original way to store and retrieve programs, but if you were willing to spend about the same amount as the computer cost you could buy an external floppy drive. The very first program I bought on a floppy was from this little company called Microsoft, and it was their version of the Colossal Cave Adventure. Being Microsoft it came on a specially formatted floppy that tried to prevent access to the code or the ability to copy it.

And that was pretty much the way of the future, with huge fortunes being built on proprietary software. But still, for the most part you were aware of what was running on your particular system. You could trust the software that ran on your system as much as you could trust the company providing it.

Then along comes the Internet, the World Wide Web and browsers. At first, browsers didn’t do much dynamically. They would reach out and return static content, but then people started to want more from their browsing experience and along came Java applets, Flash and JavaScript. Now when you visit a website it can be hard to tell if you are getting tonight’s television listings or unknowingly mining Bitcoin. You are no longer in charge of the software that you run on your computer, and that can make it hard to make judgements about security.

I run a number of browsers on my computer but my default is Firefox. Firefox has a cool plugin called NoScript (and there are probably similar solutions for other browsers). NoScript is an extension that lets the user choose what JavaScript code is executed by the browser when visiting a page. A word of warning: the moment you install NoScript, you will break the Internet until you allow at least some JavaScript to run. It is rare to visit a site without JavaScript, and with NoScript I can audit what gets executed. I especially like this for visiting sensitive sites like banks or my health insurance provider.

Speaking of which, I just filed a grievance with Anthem. We recently switched health insurance companies and I noticed that when I go to the login page they are sending information to companies like Google, Microsoft (bing.com) and Facebook. Why?

Blocked JavaScript on the Anthem Website

I pretty much know the reason. Anthem didn’t build their own website; they probably hired a marketing company to do it, or at least part of it, and that’s just the way things are done now. You send information to those sites in order to get analytics on who is visiting your site, and while I’m fine with that when I’m thinking about buying a car, I am not okay with it coming from my insurance company or my bank. There are certain laws governing such privacy, with more coming every day, and there are consequences for violating them. They are supposed to get back to me in 30 days to let me know what they are sending, and if it is personal information, even if it is just an IP address, it could be a violation.

I bring this up in part to complain but mainly to illustrate how hard it is to be “secure” with modern software. You would think you could trust a well known insurance company to know better, but it looks like you can’t.

Which brings us back to Solarwinds.

Full disclosure: I am heavily involved in the open source network monitoring platform OpenNMS. While we don’t compete head to head with Solarwinds products (our platform is designed for people with at least a moderate amount of skill with enterprise software while Solarwinds is more “pointy-clicky”) we have had a number of former Solarwinds users switch to our solution, so we can be considered competitors in that fashion. I don’t believe we have ever lost a deal to Solarwinds, at least not one in which our sales team was involved.

Now, I wouldn’t wish what happened to Solarwinds on my worst enemy, especially since the exploit impacted a large number of US Government sites, and that does affect me personally. But I have to point out the irony of a company known for criticizing open source software, specifically on security, letting this happen to their product. Take this post from one of their forums. While I wasn’t able to find out if the author worked at Solarwinds or not, they compare open source to “eating from a dirty fork”.

Seriously.

But is open source really more secure? Yes, but in order to explain that I have to talk about types of security issues.

Security issues can be divided into “unintentional”, i.e. bugs, and “intentional”, someone actively trying to manipulate the software. While all software but the most simple suffers from bugs, what happened to the Solarwinds supply chain was definitely intentional.

When it comes to unintentional security issues, the main argument against open source is that since the code is available to anyone, a bad actor could exploit a security weakness and no one would know. They don’t have to tell anyone about it. There is some validity to the argument but in my experience security issues in open source code tend to be found by conscientious people who duly report them. Even with OpenNMS we have had our share of issues, and I’d like to talk about two of them.

The first comes from back in 2015, and it involved a Java serialization bug in the Apache commons library. The affected library was in use by a large number of applications, but it turns out OpenNMS was used as a reference to demonstrate the exploit. While there was nothing funny about a remote code execution vulnerability, I did find it amusing that they discovered it with OpenNMS running on Windows. Yes, you can get OpenNMS to run on Windows, but it is definitely not easy so I have to admire them for getting it to work.

I really didn’t admire them for releasing the issue without contacting us first. Sending an email to “security” at “opennms.org” gets seen by a lot of people and we take security extremely seriously. We immediately issued a work around (which was to make sure the firewall blocked the port that allowed the exploit) and implemented the upgraded library when it became available. One reason we didn’t see it previously is that most OpenNMS users tend to run it on Linux and it is just a good security practice to block all but needed ports via the firewall.

The second one is more recent. A researcher found a JEXL vulnerability in Newts, which is a time series database project we maintain. They reached out to us first, and not only did we realize that the issue was present in Newts, it was also present in OpenNMS. The development team rapidly released a fix and we did a full disclosure, giving due credit to the reporter.

In my experience that is the more common case within open source. Someone finds the issue, either through experimentation or by examining the code, they communicate it to the maintainers and it gets fixed. The issue is then communicated to the community at large. I believe that is the main reason open source is more secure than closed source.

With respect to proprietary software, it doesn’t appear that having the code hidden really helps. I was unable to find a comprehensive list of zero-day Windows exploits but there seem to be a lot of them. I don’t mean to imply that Windows is exceptionally buggy, but it is a huge and ubiquitous piece of software, and that complexity lends itself to bugs. Also, I’m not sure the code is truly hidden. I’m certain that someone, somewhere, outside of Microsoft has a copy of at least some of it. Since that code isn’t freely available, they probably have it for less than noble reasons, and one cannot expect any security issues they find to be reported in order to be fixed.

There seems to be this misunderstanding that proprietary code must somehow be “better” than open source code. Trust me, in my day I’ve seen some seriously crappy code sold at high prices under the banner of proprietary enterprise software. I knew of one company that wrote up a bunch of fancy bash scripts (not that there is anything wrong with fancy bash scripts) and then distributed them encrypted. The product shipped with a compiled program that would spawn a shell, decrypt the script, execute it and then kill the shell.

Also, at OpenNMS we rely heavily on unit tests. When a feature is developed the person writing the code also creates code to “test” the feature to make sure it works. When we compile OpenNMS the tests are run to make sure the changes being made didn’t break anything that used to work. Currently we have over 8000 of these tests. I was talking to a person about this who worked for a proprietary software company and he said, “oh, we tried that, but it was too hard.”
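To illustrate the idea (OpenNMS itself is Java and its actual tests use JUnit, so this is only a minimal, generic sketch in Python, and every name in it is invented): each feature ships with a test written alongside it, and the tests run on every build so a later change that breaks existing behavior fails the build instead of reaching users.

```python
# Minimal sketch of the unit-test idea described above.
# (Illustrative only: OpenNMS is Java and uses JUnit; these names are made up.)

def parse_port(value: str) -> int:
    """Parse a TCP port number, rejecting anything out of range."""
    port = int(value)
    if not 0 < port < 65536:
        raise ValueError(f"port out of range: {port}")
    return port

def test_parse_port():
    # The happy path still works...
    assert parse_port("8980") == 8980
    # ...and bad input is still rejected, so a future "fix" that
    # drops the range check would fail the build.
    try:
        parse_port("99999")
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for out-of-range port")

test_parse_port()
print("all tests passed")
```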

Finally, I want to get back to that other type of security issue, the “intentional” one. To my understanding, someone was able to get access to the servers that built and distributed Solarwinds products, and they added in malware that let them compromise target networks when they upgraded their applications. Any way you look at it, it was just sloppy security, but I think the reason it went on for so long undetected is that the whole proprietary process for distributing the software was limited to so few people it was easy to miss. These kinds of attacks happen in open source projects, too, they just get caught much faster.

That is the beauty of being able to see the code. You have the choice to build your own packages if you want, and you can examine code changes to your heart’s content.

We host OpenNMS on GitHub. If you check out the code you could run something like:

git tag --list

to see a list of release tags. As I write this the latest released version of Horizon is 26.0.1. To see what changed from 26.0.0 I can run

git log --no-merges opennms-26.0.0-1 opennms-26.0.1-1

If you want, there is even a script to run a “release report” which will give you all of the Jira issues referenced between the two versions:

git-release-report opennms-26.0.0-1 opennms-26.0.1-1

While that doesn’t guarantee the lack of malicious code, it does put the control back into your hands and the hands of many others. If something did manage to slip in, I’m sure we’d catch it long before it got released to our users.

Security is not easy, and as with many hard things the burden is eased the more people who help out. In general open source software is just naturally better at this than proprietary software.

There are only a few people on this planet who have the knowledge to review every line of code on a modern computer and understand it, and that is with the most basic software installed. You have to trust someone and for my peace of mind nothing beats the open source community and the software they create.

Mark Turner : Not throwing away my shot

February 13, 2021 05:28 PM

Durham VA COVID Clinic sign

Durham VA COVID Clinic sign


I got the first of two COVID-19 vaccination shots on Saturday. For several years the Veterans Administration (VA) has been providing my healthcare. About two weeks ago I asked my doctor there if it was possible to get a shot. I stressed that I did not want to take one away from anyone else but if one were available I would love to get it. Thankfully, the VA has made it a priority that every veteran who wants a shot can get a shot. To my surprise, I got a call a day later! I was to be at the Durham VA on Saturday, 6 February at 9 AM to get my COVID-19 vaccination.

Our son Travis has been eager to get his vaccination, too, so on the off chance that he could pick up a shot he accompanied me to the Durham VA. We hit the road shortly after 8 AM and drove through mostly-empty streets to Durham.

We arrived to a somewhat chaotic scene. One of the VA’s parking decks has been undergoing repairs for the past several months and parking has been tight even on a usual day. This day, there was a stream of veteran patients all arriving at the same time for their COVID shots. Though we got there at 8:35 for a 9 AM appointment, it took several minutes to find an open parking spot. Reaching the top level of the deck, we hopped out and headed to the walkway.

It was at the start of the walkway that I stopped to read the sign on the floor. No walk-in shots would be available. Regretfully, I turned to Travis and told him today would not be his day. With sadness, he turned back and waited for me in the car.

I joined a scrum of people waiting in line after line. First was the typical COVID risk screening at the entrance. A woman studied the masks worn by the visitors and switched out ones that didn’t meet her standards. I was amused when I was asked to trade the NIOSH-certified N95 mask that has protected me for months for an uncertified KN95 mask. This was even more amusing when visitors with surgical masks that are not nearly as protective as my N95 were allowed to continue wearing them. Could it be that I know more about mask protection than healthcare workers?

The vaccination line at Durham VA

The vaccination line at Durham VA

After being asked travel and symptom questions and being scanned by an infrared temperature checker, I was given a green sticker to wear and joined a long line snaking through the first floor of the hospital. First I was handed a vaccination card and asked to fill out my name, birthdate, and last four of my SSN. I soon presented this to a woman with a laptop who was checking people in while they waited in line. This earned me an additional pink sticker to wear. Then I waited as the socially-distanced line slowly made its way down the hallways to the clinic area of the hospital.

I was happy to see so many people of color in line, too. Some Black people are wary of vaccinations and that is understandable, given the horrifying policies and experiments that were carried out on Black communities in the past. While I am not Black, I, too, was once cautious of government vaccinations. I’ve often wondered whether the shots and pills I received in the military prior to Desert Storm might be responsible for the mysterious health issues I suffer from today. While I may never fully know about those military shots, I have no qualms about taking the COVID vaccine, as I fully recognize that whatever side effects the vaccine brings on are trivial compared to the damage COVID-19 can wreak on my body. This is an easy call.

I was now at a crossroads of sorts. A woman with a paddle sign stood in the middle of the clinic hallway, watching a staffer at either end of the hall. Each had paddle signs with a red “thumbs down” on one side and a green “thumbs up” on the other. When one of the end-hallway staffers would give the signal indicating an open seat, the gatekeeper staffer would direct the patient to that end. I was sent to the left, where I took a seat at the end of a long hallway with clinic rooms on either side and veterans seated outside of each room. I spent the next ten minutes or so watching as newly-vaccinated vets walked out of each room while the vets still waiting made small talk.

The next thing I knew, the door next to me opened and a veteran exited. I was invited in.

Two women technicians wearing Duke Hospital pullovers greeted me and had me take a seat. I handed my vaccination card to one, who entered it into a computer and verified my information. While she did this, the other asked if I had allergies, gave me vaccine information, informed me of my second shot appointment, and offered to answer any questions.

“Do you know if there are more mass vaccination events like this one going on?” I asked while I waited.

She told me she wasn’t sure but knew that Duke Hospital and other big hospitals in the area are doing big pushes right now.

Then the tech at the computer turned to me. She filled her syringe, let me pick the shoulder, and administered my shot of Pfizer vaccine. I felt, well … nothing, really. In two seconds it was over, so quickly that I didn’t think to snap a photo.

“This is more for show since there’s never any blood,” the tech explained as she tried in vain to get a bandaid to stick on my hairy arm.

A supervisor or doctor entered the room and double-checked their work, making sure that the dosage information was properly entered on my vaccination card. Someone slapped yet another sticker on me, this one a paper label with “9:40” written on it. This is the time of my shot plus fifteen minutes. Then I was cheerfully directed down the hall to a waiting area where I would spend the next fifteen minutes in case there were immediate adverse reactions. I snapped my first post-vaccination photo here.

Post-vaccination photo

Newly vaccinated!


“Anyone with 9:38 or 9:40, you are free to leave,” called out the staffer at the end of the hall. With that, I hopped up, wove my way through the throng of veterans still arriving, and rejoined Travis for the ride home. In an effort to cheer him up, I bought him lunch on the way home.

And that was it. Seems I was in and out like clockwork. I had no pain in my arm and in fact no reaction whatsoever to the vaccine. It was stressed to me that I still must mask and distance as I am still vulnerable until my body’s natural defenses ramp up, but the process has now begun. Data shows that even the first shot alone is capable of preventing serious COVID-19 disease; one shot alone is enough to keep me out of the hospital should I catch COVID. The full protection won’t kick in until two weeks after my second shot, which comes three weeks after the first (27 February for me). Thus, I should be at full protection by 13 March.

What does “full protection” mean? Dr. Fauci and other experts say that vaccinated people can be around other vaccinated people with no fear of infection. Among them, life can go on as if there were no COVID. A vaccinated person around an unvaccinated person is still advised to mask up, as at this date experts are still unsure how much protection the vaccine affords. I expect we’ll soon see further studies which fill in our understanding of this.

Now the wait begins for getting the rest of my family vaccinated. This may take a while but when our turn arrives we will not hesitate to step up. As for me, my vaccination has given me the security to volunteer with NC DHHS to assist with getting more people vaccinated. Before my opportunity for a shot opened up I couldn’t have considered stepping up and helping. Now that’s become possible.

Leaving Durham VA

I have noticed a change in my mental health, too. I am hopeful and excited again. One of the few things that kept me going through this endless quarantine was the visualization of getting that shot in my arm. I pictured it in my mind on those days when I was feeling down and felt like crying. I knew the day would come, and that if I held on I would make it. And so I have.

Now I want to bring that hope and relief to others. Hope is on the horizon! Biden announced this week that 200 million more vaccine doses have been secured, so by this summer anyone who wants a shot can get one. Heck, it might be sooner than that, even.

We can make it! The vaccine is here and within weeks or mere months everyone can be protected. Hang in there!

Warren Myers : remembering sqrt

February 08, 2021 07:16 PM

A couple weeks ago some folks in the splunk-usergroups.slack helped me use accum and a modulus calculation to make a grid menu from a list.

My original search had been along the lines of:

| inputlookup mylookup
| stats count by type
| fields - count
| transpose
| fields - column

Which was great … until my list grew to more than about 12 entries (and scrolling became a pain).

A couple folks here helped me flip it to this format:

| inputlookup mylookup
| stats count by type
| eval num=1
| accum num
| eval num=num-1
| eval mod=floor(num/12)
| eval type.{mod}=type
| fields - mod num type count
| stats list(*) as *

Which works awesomely.

Unless the modulus value (currently 12) gets too small: if the total list grows to more than modval^2 entries, each individual box is no longer in alphabetical order (and then alphabetical from box to box).

So I made this modification so that regardless of the size of the list, the grid will automanage itself:

| inputlookup mylookup
| stats count by type
| eventstats count as _tot
| eval modval=ceil(sqrt(_tot))
| eval num=1
| accum num
| eval num=num-1
| eval mod=floor(num/modval)
| eval type.{mod}=type
| fields - modval mod num type count
| stats list(*) as *
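The self-managing part is just the ceil(sqrt(n)) bucket size. Here is a quick sketch of the same logic outside of Splunk (Python, with made-up sample data): each run of ceil(sqrt(n)) consecutive sorted entries becomes one column, so every column stays alphabetical and the ordering continues column to column no matter how big the list gets.

```python
import math

def grid_columns(items):
    """Split a sorted list into columns of ceil(sqrt(n)) entries each,
    mirroring the accum / floor(num/modval) trick in the SPL above."""
    items = sorted(items)
    n = len(items)
    rows = math.ceil(math.sqrt(n))  # the self-managing modval
    return [items[i:i + rows] for i in range(0, n, rows)]

# 10 items -> ceil(sqrt(10)) = 4 rows per column, 3 columns;
# order stays alphabetical within each column, then box to box.
types = ["apache", "bind", "cron", "dhcp", "dns", "ftp",
         "http", "imap", "ldap", "mysql"]
for col in grid_columns(types):
    print(col)
```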

Dunno if that’ll help anyone else, but wanted to share back that self-managing aspect I added in case anyone was interested 🙂

Mark Turner : I was the fox

February 05, 2021 02:41 AM

A fox at Glacier National ParkAt a recent conference, an African American speaker told an inspiring story of an interaction with law enforcement, when he had expected the worst intentions from the officer but his worry proved unfounded. Our speaker had been walking to the local gym after an early-morning run. Soon he became aware that a police car was slowly following him. Immediately he assumed he was being profiled.

“Did you know you were being followed?” the officer asked. The speaker feigned ignorance.

“You were being followed by a rabid fox back there,” the officer replied. “I was just watching out for you.”

The happy moral of the speaker’s story is not to assume bad intentions, see?

When I learned of this speech something didn’t seem right. Then I remembered an incident several years ago.

A few years back, my family and I had been out walking around the East Mordecai neighborhood one sunny weekend afternoon. A Raleigh police car drove by and, being the helpful sort I am, I asked them if they were looking for someone.

“Nah, just a rabid fox,” came the reply from the partner. I laughed and we all went on our way.

I am not only the helpful type, I am also the curious type. I pride myself on knowing what is going on in the neighborhood. I had to know more about the fox! Who had seen it? Where did it go? Why hadn’t I seen any mention of it on the neighborhood listserver? Normally when a fox is seen acting weird in the neighborhood it gets the neighbors pretty excited.

The lack of chatter confused me. I might have even gone as far as checking the call records at the 911 center to see what more I could find. But there were no reports anywhere. I determined that the cop had lied to me.

For the longest time, I wondered why a cop would make up a story about looking for a fox. Today I realized that I was the fox.

Mark Turner : Highlights of 2020: Wings of Carolina ground school

January 05, 2021 03:13 PM

In the spring of 2020, Travis and I took a virtual ground school from a local flying club, the Wings of Carolina (WoC). It was the second time I’d gone through ground school, the first one being in the mid 1990s. That time I never got around to taking the FAA exam and I’d hoped to complete it all this time around.

Travis has expressed his interest in becoming a pilot. He has excellent vision and would spend lots of time using my flight simulator. I’d promised him long ago if he completed ground school I would be happy to pay for it, but still he was on the fence. When dates opened up for the virtual ground school, I prodded him multiple times about signing up but he was noncommittal. Finally, I signed myself up with the goal of finally finishing what I started. This was all it took to convince Travis to sign up, too, and we were off!

Twice a week, we would gather in front of our playroom TV to “attend” class. For three hours per night we’d be on Zoom as our instructor, John, filled out formulas and sketches on a whiteboard in WoC’s classroom. About 30 other classmates joined in, too. There were some technical glitches, fewer opportunities to interrupt with a question, and more of a distant feel to it than I would’ve hoped.

There was also a lot to learn. Being that this was near the beginning of the pandemic, shouldering the needs of this course while still panicking about potentially getting sick and going through work changes was a lot to take on. I did the best I could but I found I was not as engaged as I should’ve been. I was happy to see Travis really get into it, though. Towards the end when we were tasked with planning a virtual flight, Travis did his weight-and-balance calculations like he’d been doing it all his life. He arrived at his answers long before the rest of the class (and instructor!) had worked out their solutions. It seemed to me that he had a knack for it. I was so proud!

When it came time for the final exam, though, Travis confessed that he felt he wasn’t ready. Like me, he didn’t feel like he had learned what he needed. I was disappointed but I certainly understood. And so, neither one of us wound up taking our FAA final exams. I believe that to this date there is still a tab open on my mail app for the class exam email.

In hindsight, I think it would’ve been helpful to slow down when teaching virtually a course that is normally taught in person. Classmate interaction suffers in virtual classes, so offering one night each week where classmates could chat together in a virtual study hall would’ve helped. And finally, being much closer to actual aircraft (as we would’ve been had we taken the course in person) would’ve really helped us picture ourselves as pilots. With nearly no one traveling right now, it was hard to consider going through this training and then not using it anytime soon (though now I know that this would actually be the best of both worlds – the convenience of air travel without the hazards of a boarding process).

For Travis’s October birthday, I gave him the gift of an “introductory flight” at a local flight school. When I spoke with him to schedule the flight, to my surprise he turned it down. Well, postponing is more like it. He did not feel comfortable spending even an hour crammed into the tiny cockpit of a Cessna with a flight instructor.

And, he’s right. To do this for 20+ hours while training for a private pilot license would certainly not be safe, given what we now know about SARS-CoV-2.

So, it’s not the right time to take this on. Perhaps this year or next.

Mark Turner : Highlights of 2020: The Election

January 05, 2021 02:50 PM

One absolutely wonderful thing that happened in 2020 was the U.S. Presidential Election. Elections bookended the pandemic for me. In March 2020, I volunteered to be an inside poll observer for the Wake County Democratic Party. This gave me insight into how elections are carried out. Being one of those rare people who have never missed an election, I was already well-familiar with how the process worked from the public point of view but learning more about the various checks put in place was quite educational.

COVID was a thing in the March primary but not taken as seriously. Spending so long in a school classroom turned polling place, packed with dozens of strangers, seems like suicide to me now. The general election was far more strict, with volunteers carefully limiting the number of people indoors.

I was also disappointed to be restricted in my movement during the general election. Chief judges would corral the observers into one area rather than letting us do our jobs. After some cajoling I managed to get this largely fixed. I’m sure part of the issue was the threat of violence that was on everyone’s mind due to heightened tensions.

Without exception, though, the interactions I had with the Republican poll observers I spent time with were positive. We had good chats about the state of the community and the country. In the past I would’ve posed for pictures with them but the pandemic made that unworkable.

Joe Biden criss-crossed the country, drumming up votes. One weekend afternoon in the summer of 2020, at the depths of his support at the time, Biden spoke at St. Augustine University. We found out about it too late to see the whole speech but Travis insisted on going over there anyway. He walked over and entered the gym, which was mostly empty as everyone had left. Biden was there and Travis simply walked up to him and got a selfie. That means that during the campaign, Travis had his photo taken with Elizabeth Warren, Bernie Sanders, and Joe Biden (Hallie also got snaps with Warren and Sanders).

I recall Hillary Clinton rallying at St. Aug in 2016 and I couldn’t be bothered to walk two blocks to see her. I should be more welcoming when politicians show up on my doorstep, I suppose!

Trump crimed all he could to keep Biden from winning the election but Trump still lost in a landslide. His campaign team filed 62 lawsuits challenging the results and has so far lost 61 of them. The Rudy Giuliani press conference from Philadelphia’s Four Seasons Total Landscaping will go down in history as a highlight of Trump stupidity.

When news organizations finally, finally called the election in Biden’s favor on Nov 7th I was working in the yard. Suddenly I was startled by fireworks in the neighborhood. Spontaneous celebrations broke out among neighbors – something I had never, ever seen before in my life. I put down my tools and wandered down the street to drink champagne and holler in the street with my neighbors as we cheered democracy’s victory.

Results have now been certified in all the states, the election has been called the most secure in America’s history, and it’s all but over. Biden takes office in 15 days, whether Trump likes it or not. Still, several U.S. Senators intend to object to the certification of the electoral votes. They are seditious bastards in my opinion.

I look forward to mind-numbing normalness from a Biden administration. I look forward to being enraged at Biden over totally minor quibbles. I am happy, though, that a criminal family will no longer be in the White House.

And I hope justice catches up with Donald Trump, his family, and his criminal friends.

Mark Turner : Highlights of 2020: Bermuda sod

January 04, 2021 03:53 AM

I got sick of having a disaster of a lawn. Over a decade ago I had vowed to hang up my hoses and not waste money on grass, but something had to be done. I decided that drought-resistant “Celebration” hybrid Bermuda sod was what we needed.

At the end of summer, I killed all the weeds and grass in our front and back yards. A few weeks later I had a giant tractor trailer deliver 11 pallets of sod. It was pouring down rain when the driver arrived. In his efforts to place mulch in our backyard his forklift quickly got mired in the mud at the end of our driveway. For two hours we struggled to get enough traction to free his forklift, only succeeding when my neighbor Chris arrived to help steer as we pulled.

The damage to the back yard had been done, though. A 6,000 pound forklift cut deep ruts in our muddy yard and those ruts had to be repaired before the sod could be put down.

It took back-breaking, Herculean effort by myself, Kelly, and Travis to repair the yard and get the sod put down while it was still alive. I personally pushed myself past the point of exhaustion many nights. I was a wreck. Surprised I didn’t have a heart attack, actually.

It was a stupid, stupid amount of work. But. We. Got. It. Done!

And it looked incredible! For once we had a strong turf grass that didn’t mind the sun or the shade! It’s been perfect.

With the first freeze, the sod has gone dormant and some weeds have appeared but overall it will look fantastic in the spring when it greens up. I won’t have to do much with it to maintain it, either. So far it’s been a great investment in our home.