Planet TriLUG is an aggregation of public weblogs written by members of the Triangle Linux Users Group. The opinions expressed in these weblogs, and hence in this aggregation, are those of the original authors. Planet TriLUG is not a product or publication of the Triangle Linux Users Group; as such, it does not necessarily represent the views of TriLUG as an organization.
Planet TriLUG is powered by Planet and is run by Tanner Lovelace. Mail him with the full address of your RSS feed and a short description of your affiliation with TriLUG if you want your blog added to or removed from the subscription list. We update every half hour, so having your feed reader check more frequently than that will just waste bandwidth.
This page was last updated March 24, 2023 09:01 AM (all times are UTC).
The final day of SCaLE 20x was bittersweet, as I was eager to see more presentations but not ready for it to be over.
Dr. Kitty Yeung
The opening keynote was given by Dr. Kitty Yeung. Dr. Yeung is one of those amazing people who makes me feel completely inadequate. A graduate of Cambridge and Harvard, she has worked in fields as varied as fashion and quantum computing. She is also an artist, and most of her slides were ones she created herself.
A lot of her current work centers on the intersection of technology and fashion. Now I am the least fashionable person alive. Seriously, when I’m not in front of customers I wear the same clothes every day: a black, heavy-weight pocket T-shirt and Levis blue jeans.
I have often thought if I ever did start another company one option would be to create modern tech for older people. Now some people may say that products from companies like Apple are easy to use, but as someone who is often around people in their 80s I know this isn’t true for them. There should be a market for very simple, but powerful, tools aimed at people in this age group. I keep thinking of the Yayagram machine I saw a few years ago as an example.
Dr. Yeung’s work on integrating tech and fashion could be a great interface for these products.
Shifting gears a bit, the next presentation I attended was by Don Marti on privacy.
Don Marti
While it is hard for an individual to balance privacy and convenience in today’s surveillance economy, there are some steps you can take to minimize what personal information you share. I take a number of steps to increase my privacy while on the Internet and this talk gave me a few more tools to use.
One of the things I love about SCaLE is that they usually have an amazing closing keynote. It is cool because you get to end the conference on a high note, and as a speaker it is always nice to have something to keep people from leaving early on the last day.
This year’s keynote was no exception and featured Ken Thompson, one of the creators of Unix and a co-creator of the Go programming language.
Before he spoke, Ilan Rabinovich gave some closing remarks reflecting on 20 years of SCaLE (which I learned started out as an umbrella conference for Southern California area Linux User Groups).
SCaLE Founders
You can see a much younger Ilan as well as the still very tall Gareth Greenaway in that picture from SCaLE 1x. As someone who has been working in open source for over two decades it just doesn’t feel that long to me, so it was cool to reflect on all that has happened.
Ken Thompson with a picture of him and his siblings
Two decades pales in comparison to the experience of Ken Thompson. He was hired by Bell Labs the year I was born.
He gave us some of the history of his time there and walked us through the creation of what was probably the ur-archive of digital music. In the before times, back when mp3 encoding came out and people worked in offices, some of us would bring in our compact disc collections, rip them and place them in a common archive. Ken’s project pre-dated mp3s and started out as a quest to collect all the Billboard hit songs from 1957. As someone with mild OCD issues, I felt seen when he talked about how that expanded to collecting all the songs (grin).
Of course, digital content isn’t useful unless you can access it, so he modified a Wurlitzer jukebox with a couple of iPads to provide a cool interface, and then, because he is awesome, he bought a refurbished player piano with a MIDI interface so you could trigger that from the same device.
So the best way to sum up Sunday at SCaLE is that you are a lazy bum compared to folks like Dr. Yeung, Ilan and his team, and Ken Thompson, who apparently thinks about making a space shuttle out of discarded household appliances while you are watching re-runs of The Big Bang Theory.
(grin)
Hats off to the whole SCaLE team for another great conference, and I’m so happy that it was back in Pasadena. I am already looking forward to next year.
I got up fairly early on Saturday and went through my presentation one final time. When working on a new talk there is a point where the feeling I get when thinking about having to present it goes from anxiety to eagerness and that happened this morning, so I felt ready to go.
The conference started off with a keynote by Arun Gupta, who is a VP at Intel focused on open ecosystems.
Arun Gupta Keynote
His talk was about using open source cultural best practices within an organization, and he used specific examples of how that was being done at Intel. It was the first time I had seen the abbreviation “CW*2”, which stands for the Zen saying “Chop wood, carry water”. While that phrase has a lot of different meanings, when applied to open source it references the idea that as a member of an open source community one should focus not just on the high profile aspects of the project but also on the more mundane ones that actually keep the project alive.
After the keynote it was time for my presentation. I was originally scheduled to speak on Sunday morning but due to a conflict I got a spot on Saturday. I was grateful as I like to get my responsibilities out of the way so I can enjoy the rest of the weekend without worrying about them.
Me at the end of my presentation (image yoinked from Zoe Steinkamp’s LinkedIn feed)
I did a talk on open source business models and how things have changed in the past decade or so. My “hook” was to do the presentation in the format of an old school text adventure.
It was fun (and yes, there was a grue reference). It seemed to go over well with the audience and there were a number of great questions afterward.
With that over I decided to walk down the road to grab lunch, and on the way I ran into Gareth Greenaway. Gareth was one of the original organizers of SCaLE and it was cool to be able to catch up. He is currently doing some amazing things over at Salt.
SCaLE always has a wonderful hallway track and I also got to see John Willis. I had not seen him in years although we used to cross paths much more frequently and it was nice to be able to catch up. He is a co-author on a new book called “Investments Unlimited” which chronicles the DevOps journey of a financial institution.
I also had some time to wander around the Expo floor. I try to minimize the amount of swag I bring home but I’ve started to collect those little enamel pins that some people give out.
Enamel pins on my backpack
The AlmaLinux pin was given to me by the amazing benny Vasquez, who was spreading the word about their project, which helps fill in the gap left by the CentOS project migrating to CentOS Stream.
Me and benny Vasquez
This year I spent a lot more time in sessions than I normally do as they were just so good. Many times I found myself having to decide between three or more talks that occurred at the same time.
One that I didn’t want to miss was given by Zoe Steinkamp on using InfluxDB to monitor the health of plants.
Zoe Steinkamp
I spent much of my professional career in observability and monitoring so I have a soft spot for unique applications of the technology. Zoe uses sensors to feed information about humidity, sunlight, etc. from her houseplants into InfluxDB so that she can use that information to maintain them in the best of health. My spouse keeps koi and I do something similar to monitor water temperature.
The next presentation I attended was on the Fediverse. Now I have never been much of a social media person, and last year I deleted my Twitter account, which left LinkedIn as my only mainstream service. I do have a Mastodon account and with the recent migration of a lot of people to the platform I do find it useful, although I don’t spend nearly as much time on it as I did on Twitter. I think it has a lot of potential, however, and what it really needs is that killer app to make it easier to use.
Bob Murphy presents on the Fediverse
Bob Murphy did a great talk on how the Fediverse is not Mastodon, and he introduced me to a number of other services that use ActivityPub, which is the underlying protocol. For example, there are sites that focus on image as well as video sharing, not just microblogging. Speaking of blogging, Automattic (the company behind WordPress) announced that they acquired the makers of an ActivityPub plugin to bring the technology in-house and it seems like they plan to make it a core part of their app.
The final talk I attended was given by Michael Coté. I’ve known Coté for over two decades, going back to when he lived in Texas, and it was nice to see him again (he’s living over in Yurrip these days).
Coté on Developer Platforms
As usual, he provided some great insights on what he is calling “platform engineering” (think DevOps mashed up with SRE).
After the talks were over I met up with some friends for dinner. Now I am a fan of the television series The Big Bang Theory. It is set at Caltech which is located in Pasadena, and there is even a street named “The Big Bang Theory Way” (my picture of the street sign didn’t come out, unfortunately). During the weekend I kept hearing people talk about a place called “Lucky Baldwins”. I thought it was a joke since the character of Sheldon in the TV show makes a reference to the place in an episode called “The Irish Pub Formulation” but it turns out it exists.
Lucky Baldwins
We stopped there for a drink and ended up staying for dinner. It was a nice ending to a busy day.
I spent Friday morning practicing and working on my presentation, but managed to make it over to the conference just before lunch.
SCaLE 20x Sign
I was really impressed with the “steampunk” graphics for this year’s show. They were cool.
Check-in, as usual with SCaLE, was a breeze. They have automated most of it. You walk up to a bank of computers, choose one and enter your registration information, and your badge gets printed. I think you could purchase a registration through the system as well.
Then you walk down to a table to get your conference bag, badge holder and lanyard.
After wandering around for a bit I went down the street to meet up with Aaron Leung. While I love many things about being able to work remotely, I do miss meeting people in person and especially people I work with at AWS. Aaron happens to live in LA and he was kind enough to come out to see me and we had a great lunch.
Having SCaLE back in Pasadena was awesome. Not only is the convention center nice, it is really close to a ton of restaurants so you have a bunch of options for dining. The only downside was that it was raining (you can see the folks with the umbrellas above). When I had to go outside it wasn’t bad – more of a mist – and it was strange to have rain in LA. It did make the hills very green, however, and quite the departure from the usual tan.
After our long lunch I worked some more on my presentation, and then headed back over to the conference. The Expo floor was open so I spent some time wandering around and looking at the booths.
The next talk I caught was from Bryan of Oxide, and the “forgotten” operator in its title refers to people tasked with running on-premises data centers. Now I’ve been in a number of data centers and they were all as he described: racks upon racks of 1U and 2U servers arranged in rows, some with “hot” aisles and “cold” aisles, and each server with a pair of power supplies and lots and lots of cabling.
I have never been inside a Google or Amazon data center, but I’ve always imagined it to be more along the lines of the one Javier Bardem’s character set up in Skyfall.
Picture of a data center from the James Bond film Skyfall.
In these days of the “cloud”, compute is divorced from storage and so a lot of the hardware in an old school 1U rack mount machine is unnecessary. Plus there is the antiquated idea of having separate power supplies for each board in the rack. Computers run on DC power, so why not just supply it directly from a central source vs. individually? I started my professional career working for phone companies and everything was DC (many central offices had a huge room in the basement with chemical batteries – and, yes, it did smell).
When I started my own company 20+ years ago I had two Supermicro 1U machines and when I turned them on they were each louder than a vacuum cleaner. Bryan told us that their racks are whisper-quiet (well, once they are powered on and the fans on the rectifiers spool down).
I’m oversimplifying, but that is the basic idea behind Oxide. They want to supply cloud-grade computing gear to enterprises and break the old paradigm of what a data center should look like. Users can still leverage cloud technologies like Kubernetes but on their own gear. It still doesn’t solve the need to have people who understand the technology on staff, but it was exciting in any case.
The day ended with UpSCALE, SCaLE’s lightning talk session. Lightning talks are 5 minute presentations consisting of a set number of slides that advance automatically. I’ve never given one, and once when I mentioned that I thought it was cool it was pointed out that I can’t introduce myself in five minutes, much less give a talk. (grin)
I was impressed with the presentations. One that stuck out was the fact that the term “open source” as formalized by the Open Source Initiative is now 25 years old. Wow.
After UpSCALE a group of us went down the street for dinner and drinks. I can’t emphasize enough how much I miss the face-to-face aspect of in-person conferences and I hope we can continue to have them safely.
Today I left for Pasadena and the 20th iteration of the Southern California Linux Expo.
Me on a plane
SCaLE is one of my favorite events of the year, and I’ve been coming (for the most part) since SCaLE 5x.
This year I’m giving a presentation on open source business models, and I’m pretty happy with how it turned out.
I didn’t get to attend any of the sessions or activities on the first day, but I did manage to have dinner with some friends including Ilan Rabinovich, who is one of the main organizers of the event, and Stephen Walli, who works on the open source team at Microsoft. I also got to meet for the first time Amye Scavarda Perrin who is a program manager at the Cloud Native Computing Foundation.
Ilan, Stephen, myself and Amye
While I think virtual conferences have a lot to offer in the way of education, I really do miss these opportunities to meet face to face and to interact with interesting people. I’m hoping that in-person events become more common in 2023.
Not to misquote the Beatles, but it was 20 years ago today that I posted my first entry to this blog.
By 2003 blogs were pretty popular so I was somewhat late to the game. My friend Ben Reed had a blog that he used kind of like a proto-Twitter, where he would post many times during the day on what he was doing, which at the time focused on porting KDE to MacOS. Back then a lot of open source projects used blogs as a communication platform and since I was maintaining an open source project I figured I should start one. He used Movable Type as his blogging software so I did as well.
Movable Type was very popular back then, but when they started to move their licensing to a more proprietary model, people were turned off and migrated to WordPress. I find it delightfully ironic that WordPress, which is open source, now forms the basis for around 40% of all websites, whereas most people these days have probably never heard of Movable Type.
If there happen to be any younger readers here, blogs twenty years ago were like podcasts today: practically everyone had one. Also like podcasts, most were sporadically updated, which is why Really Simple Syndication (RSS) became important. RSS is a protocol that lets you find out when websites are updated. Using a “news reader” like Google Reader, you could aggregate all the websites you were interested in following into one application. It was pretty cool.
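If you have never poked at a feed directly, here is a rough idea of what a news reader does behind the scenes, sketched in Python with the feedparser library (the feed URL below is just a placeholder, not a real endpoint):
# A minimal sketch of what an RSS news reader does: fetch a feed and
# list its most recent entries. Requires: pip install feedparser
# The feed URL is a placeholder; point it at any blog's feed.
import feedparser

feed = feedparser.parse("https://example.com/feed.xml")

print(feed.feed.get("title", "Untitled feed"))
for entry in feed.entries[:5]:
    # Each entry carries the post title, link, and (usually) a publication date.
    print(f"- {entry.title} ({entry.get('published', 'no date')}) -> {entry.link}")
A reader application just does this for every feed you subscribe to, on a schedule, and shows you whatever is new.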
But then along came social media sites and what people used to post on blogs they started posting there instead of on their own sites. Even with a lot of hosting options, running a blog is incrementally harder than posting to, say, Facebook. In 2013 Google killed Reader which pretty much ended blogging (although I still use RSS and find that the open source Nextcloud News is a great Reader replacement).
But I’m old and stubborn so I kept blogging. In fact I think I have something like five or six blogs that I update periodically. I use another blogging technology called a “planet” to aggregate all of those blogs so my three readers can easily keep up with what I’m doing.
Another thing that social media brought about was this idea of engagement. People still look at metrics such as number of followers as an indication of how far a particular post reached, and even when I started this thing folks would brag about their stats. As a contrarian I took the opposite approach and decided that I’d be happy if just three people read my posts. I got a chuckle the first time someone came up to me and said “hey, I’m one of your three readers”. Made the whole thing much more personal.
And to me blogging is personal. I love to write and the best way to become a better writer is to do it. A lot. I really wish I had more time to post but between my job (which involves a lot of writing) and the farm it is hard to find the time. As someone who loves the culture around open source software, sharing is key and I hope some of the stuff I’ve posted here has helped someone else as so many other blogs have helped me.
That’s about it for this update. I would promise that I’ll post more often and with better content in the future but I don’t like to lie (grin), and in any case thanks for reading.
While I love living “out in the country” I often envy my urban friends for their network connectivity. When I moved out to the farm in 1999 the only “high speed” access was satellite, and even that required a modem and a phone line. I was overjoyed when Embarq finally deployed DSL to my house, and while 5Mbps down might not seem like much these days it seemed heaven-sent back then.
Jump forward 20 years and Embarq became Centurylink which is now Brightspeed. I had a pretty poor opinion of Centurylink (or as I called them CenturyStink) but high hopes for Brightspeed when they bought Centurylink’s ILEC business in our area, but they have been disappointing. Here is that story.
Both my wife and I work from home, and when our DSL circuit is working it works well enough for us to get our jobs done. At 11Mbps down and 640kbps up it doesn’t even qualify as “broadband” but it is a trade-off I’m willing to make in order to live where I live.
Starting back in early November we began to have issues with the quality of the DSL connection. Quality issues are always frustrating since the support technicians at the provider never seem to have the tools to properly measure it. Instead they just tell me the circuit is “up” so I should be satisfied, even though I tell them that while it is up, it is unusable.
The issue was high latency and packet loss. Latency is a measure of the time it takes information to travel through the network and packet loss indicates that some of that information never makes it to its destination. The protocols used in networking will automatically deal with packet loss by sending the information again, but the more this happens the worse the experience is for the user. Things that can handle packet loss gracefully, like e-mail, web pages and chat, just seem very slow, while anything that requires a more steady flow of information, like video or gaming, just doesn’t work at all.
Having done network monitoring for much of my professional life, I monitor the quality of my DSL circuit by attempting to reach the 8.8.8.8 IP address, which is a highly available DNS server run by Google. Here is a recent graph:
Now normally the graph should be green and pretty much focused around 45ms. This one was all over the place. I asked my neighbor to execute a ping to the same IP address and her connection was working fine, so I assumed it was an issue specific to me.
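If you want to run a similar check yourself, here is a rough sketch in Python that leans on the system ping command (the flags assume a Linux box; a proper monitoring tool is the better long-term answer):
#!/usr/bin/env python3
# A rough sketch of a latency/packet-loss check against 8.8.8.8.
# Assumes the Linux iputils ping; adjust the flags for other systems.
import re
import subprocess
import time

TARGET = "8.8.8.8"

while True:
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", TARGET],
        capture_output=True, text=True,
    )
    match = re.search(r"time=([\d.]+) ms", result.stdout)
    stamp = time.strftime("%Y-%m-%d %H:%M:%S")
    if match:
        print(f"{stamp} latency {match.group(1)} ms")
    else:
        print(f"{stamp} request timed out (packet lost)")
    time.sleep(60)  # one sample a minute is plenty to spot a trend
Feed that output into anything that can draw a graph and you will spot latency spikes and loss pretty quickly.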
Trying to get support from Brightspeed was very frustrating. As I mentioned above they just tend to tell me everything is okay. I even reached out to one of my LinkedIn contacts who is an executive vice president at Brightspeed for help, and I think he was responsible for a ticket being opened on November 7th in the “BRSPD Customer Advocacy (Execs)” queue. I really appreciated the effort but it didn’t help with getting my issue resolved.
Just after Thanksgiving I called again and I was told that the problem was that my modem was wrong and didn’t work with the DSL circuit we have, even though it was the model they sent to me and it had been working fine up until November. In any case they said they would send me a replacement and it would arrive in two days.
Ten days later I call back and ask, hey, where is my modem? They told me it was still “in process” but since I’ve been waiting so long they’ll overnight one to me. It shows up the next day and of course doesn’t fix the issue, but it has newer software than my other modem and reports the status of the DSL circuit. On the web page of the device this is usually represented by the word POOR in dark red, but sometimes it would improve to MARGINAL in a slightly lighter red. I call and explain this to Brightspeed, and after dealing with this for over two months they agree to send out a technician in two days.
When I finally get the e-mail confirming the appointment, it is for the following Thursday, January 5th, eight days away. That also happens to be when we are closing on a new house, so I can’t be here to meet with them. For some appointments they show up early, so I didn’t change it right away, but when they hadn’t shown up by Tuesday I decided to reschedule it.
I went to the e-mail and clicked on the link to reschedule and got sent to the Centurylink site. Of course they wanted me to confirm my account, but nothing I typed in: phone number, e-mail, or account number, worked because I no longer have a Centurylink account. (sigh)
[Note: it looks like this has been corrected, finally, on the Brightspeed website. Not sure about links in e-mails]
In the process I did find out that my Brightspeed account number ends in “666” so perhaps that is indicative of something.
I eventually ended up calling support once again. I believe it would take the average caller about six minutes to reach a human through their system, as it prompts you for a variety of things before allowing you to speak to a person, but I had been calling for over two months so I can speedrun the thing in about four and a half minutes by pressing buttons before the prompt is finished.
The person I talked to about rescheduling the appointment kept me on hold for about 30 minutes before telling me that the whole dispatch system for technicians was down and that she would call me back in two hours. She never did.
The next day I made one more attempt to reschedule the appointment, but was told that the next available appointment was so far out in the future that I should just keep it, since the technician won’t need to enter the house. I left a long letter taped next to the demarcation box on my house with a detailed description of the problem, and hoped for the best.
Unfortunately, they sent out Brandon. To my knowledge there are only two technicians assigned to our rather large county: Brandon and Elton. I much prefer working with Elton since Brandon doesn’t really seem to be the kind of person who does a deep dive into the problem, but I recently learned that Elton has moved into the back office and wasn’t doing service calls anymore.
As I feared, Brandon marked the issue as closed without fixing it. Once again into the support phone queue, where I was told that he had run a test “for five minutes” and my circuit was fine. (sigh)
I did get a text asking if my problem was resolved, to which I said “NO!” and I was later contacted by a person from Brightspeed to follow up. After a very long conversation she offered to send someone else out, and that person arrived yesterday.
Philip, who is based out of Wake County (one county to the east of us), showed up promptly at 8am and within ten minutes had diagnosed a grounding issue with the wires coming to our house. In about 45 minutes he had repaired it, but he warned me that there was also an outage in the area which would explain my now 900ms ping times (but no packet loss). I trusted him that it would eventually resolve, and about 30 minutes later things were much, much better.
You can see where the network was bad before Philip showed up, the gap where he was working on the system, and then the return to a more expected quality of service.
It still isn’t perfect. I’m seeing a lot of jitter from time to time which is indicated by the spikes, but for the most part the user experience is fine. I was able to participate in our departmental weekly video call without issue yesterday for the first time in months.
And that’s what really bothers me the most. For nearly three months Brightspeed was gaslighting me that my service was fine, when, as most IT professionals would expect, it turned out to be a physical layer problem. In retrospect it makes sense, since we’ve been having an especially wet winter and that would have amplified the grounding issue.
I figure I spent between 40 and 60 hours actively involved in getting this addressed, and that is time I’ll never get back.
Of course it could be worse. The local newspaper published a story about a community in Chatham County that was without service from Brightspeed for a total of 51 days. At least our connection was usable enough that it only required a few trips to the public library for access during important deadlines.
There is some good news in that same newspaper issue: attempts are being made to help those of us in rural areas get broadband. Some of you may be thinking Starlink, but I was on their waiting list for two and a half years without getting my equipment, and when they pushed delivery out to late 2023 I just gave up and asked for my deposit back.
I am not a huge “we need to regulate everything” kind of guy, but broadband has become a service so important, and one the free market has so clearly failed to provide, that I would welcome government involvement in getting this issue addressed. But so far the communications lobby has been strong enough to prevent any kind of oversight, so I won’t hold my breath.
I’m still around, doing tech things, but I’ve been busy with work and family.
Posting this quick update as I’ve made a theme change (my old theme was no longer supported and breaking) but it’s not to my liking yet. I’ll fix it when I have a little more down time.
I’ve also moved my WordPress to a hosting provider as I’ve sadly given up running my own co-located hardware. The email world finally bullied me out, a story for another time. This also means I’m finding broken links on the site that referenced old static content outside WordPress, which I need to go back through and recover. Another task on the list.
The Splunk command fieldsummary is amazing – I use it quite frequently to explore “new” (to me) sourcetypes, and to find out about more fields than I’ve previously used in the sourcetypes I work with most.
But sometimes you want to be able to delineate more granularly than fieldsummary will allow.
Maybe you have a single sourcetype that happens to have a couple of variations. Forescout CounterACT data is like this: it’s all JSON, but there are ways to distinguish events based on the field ctupdate.
index=ndx sourcetype=srctp <field_to_split_on>=*
| fields - _raw index sourcetype
| foreach *
[ eval <<FIELD>> = mvindex('<<FIELD>>',0) ]
| stats latest(*) as * by <field_to_split_on>
| transpose 0 header_field=<field_to_split_on>
| rename column as field
Run this in Verbose mode over a long enough time window to capture what you want to see (at one customer, I could pick earliest=-20m and have an ample sample).
I’m removing the fields _raw, index, and sourcetype because I “know” the index and sourcetype, and _raw just isn’t that helpful in this context.
Ever since my parents got older, I’ve been wanting to create a tech company focused solely on making technology available for the elderly in a fashion that is easier for them to understand. Perhaps this will all go away with AI and digital assistants, or when my generation that grew up with tech gets older, but I have watched them struggle sometimes with mobile phones and TV remotes, even ones that are supposed to be simple, and I realize there is probably a market for such solutions.
While I don’t have capital for such an undertaking, I do have access to open source software and hardware, and I have a device idea that shouldn’t require heroic effort to create.
Remember the “Easy Button” from Staples?
Staples Easy Button
What I want is something about the same size, but when you press it, it will send a notice to an app on my phone.
My mother died last year and we recently bought a new home in part because it has a basement apartment where my father can live. He will have his own space but we’ll be close enough to help him out if he needs it.
The idea for this button came to me when I was thinking about what would happen if he needed some help but for whatever reason he couldn’t either call out so that we could hear him or get to a phone. Unless he was severely incapacitated he should be able to press a big button, and since I almost always have my phone with me all I would need would be an app that could send me a notice.
Since my three readers are very smart (not to mention devilishly attractive) you have probably thought about existing services (remember the “I’ve fallen and I can’t get up” Life Call ad from years back?) but older people can be extremely proud and they hate being reminded of their age. I’m pretty certain my father would resist carrying around a device on a lanyard but he wouldn’t mind having a button nearby “just in case”.
The feature set would be pretty short:
A big button (‘natch)
Some indication that the button has been pressed (light or buzzer)
Settings:
A way to name the button
Configure the Wi-Fi connection
Configure a list of users to contact when the button is pressed
For ease of use the first generation of such a device would not be battery powered, but if it were, there would need to be a way to make sure the battery stayed charged.
While I have worked with Raspberry Pi boards I have not done anything with the Pi Zero, but I assume this would be a perfect application for it. The original Easy button could be repurposed for this device but there are also a ton of options on Amazon that could work as well.
A bit harder would be the app software, as I am led to believe that getting notices in the background on mobile devices can be tricky and I don’t want to have to have the app running all the time. I have enough skills that I could make something that would send an e-mail (which would remove the need for a separate app) but I’m hoping for a solution with more reliability and less latency. It would be nice to have it send a notice no matter where I am, but if it only worked when my phone was on the same local network as the button that would be acceptable (I’m trying to figure out a solution that wouldn’t involve a server).
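Just to make the idea concrete, here is a rough sketch of that e-mail fallback on a Pi Zero using the gpiozero library. The GPIO pin, addresses, and mail server are all placeholders, and the light/buzzer feedback is left out:
#!/usr/bin/env python3
# Rough sketch of the "big button" idea on a Raspberry Pi Zero: when the
# button wired to a GPIO pin is pressed, send an e-mail to a contact list.
# The pin number, addresses, and SMTP settings below are placeholders.
import smtplib
from email.message import EmailMessage
from signal import pause

from gpiozero import Button

BUTTON_PIN = 17                  # whichever GPIO pin the button is wired to
CONTACTS = ["me@example.com"]    # people to notify when the button is pressed
SMTP_HOST = "smtp.example.com"   # placeholder mail server

def notify():
    msg = EmailMessage()
    msg["Subject"] = "The button was pressed"
    msg["From"] = "easy-button@example.com"
    msg["To"] = ", ".join(CONTACTS)
    msg.set_content("The button was pressed. Please check in.")
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

button = Button(BUTTON_PIN)
button.when_pressed = notify
pause()                          # block and wait for presses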
Anyway, just putting this down as a placeholder for when I have some free time to pursue it, but I also figure someone may have done this already and by posting this I’ll find out about it.
I am not the biggest social media user but I am also no slouch. I am proud that MarkTurner.Net predates Google and nearly every other commercial social media site still around. I wasn’t the very first to use Twitter but I was one of the first. I joined in 2008 back when it was a bumpy ride of a place with frequent failures. As it grew, I was astonished to see actual revolutions being built around tools like Twitter, and I became hooked. I could follow the accounts of notable people and get the facts straight from them. I could turn to Twitter whenever something important was happening in the world, as there was bound to be a play-by-play available from somebody there. I built up a whole list of people I followed and a much smaller list of people who followed me. It was amusing, educational, and community-building.
Then Elon Musk showed up. In a fit of insanity, he bid for the company and then desperately tried to back out of the deal. Finally, he took the helm, fired all the people who knew how to run it, let the Nazis and traitors back on the platform, and promptly made other erratic decisions that not only killed off any trust anyone had in the platform and community but also killed Musk’s reputation as a supposed genius. As a result, Twitter was crippled with outages and failures, loyal users were alienated, and the stock of Tesla (which was used as collateral for the Twitter purchase) swiftly tanked. Musk ended the year as the person who had lost more money than anyone else in history.
What to do about my account and its 16,000 posts? I didn’t see many viable alternatives to Twitter. There is Mastodon, though it is quirky enough to keep people from adopting it. I could stay put at Twitter, but I really don’t want to offer any support to Musk’s right-wing Twitter adventure.
I made the decision to stop updating my Twitter account and move to the Fediverse (where Mastodon is the most popular platform). I joined a few Mastodon servers (the first in November 2017) before deciding I’d set up my own server using a Mastodon cousin, Pleroma. I have my full Twitter archive downloaded and could easily add it to my “toots” or posts, but for now I prefer to start off from scratch. Though I no longer update it, I will keep my Twitter account alive to tune in every now and then on the chaos over there. I will also use my developer account to mirror some interesting Twitter accounts over to my Pleroma account. That way I can continue to follow the accounts I enjoy following while not having to log into Twitter to do so.
Mastodon/Pleroma and the Fediverse aren’t perfect, certainly. But I was a Twitter user back when Twitter was still shaky. I have confidence that the Mastodon/Fediverse community will work to figure things out and that my experience will continually get better.
The kids have been home from college for the last few weeks on their holiday breaks. It has been wonderful having them home again, with lots of catching up, games, hikes, jokes, and just hanging out. I know how my parents felt when I returned home back in my college/military days. There’s a special comfort knowing they’re close by. I would walk by their doors in the morning (and sometimes the afternoon), smiling at the knowledge that they were home.
The past few days have been tougher, sending them back to their studies. Hallie packed and left on Friday, bound not for Chapel Hill but for a semester interning in DC. She’s excited to be starting a new adventure and Kelly and I are both excited for her and proud of her.
We had most of the rest of the weekend with Travis, though he also packed up this morning and I drove him at 10 AM to meet his carpool buddy for the trip back to Asheville. He is doing well in his studies and in the interests he has picked up.
Now it’s just Kelly, me, and the dogs, and the quiet is settling in. I’ll miss the lights left on, the dishes strewn around the kitchen, the constant loads of laundry, and even the late night kitchen raids. Those things that once annoyed me now bring me comfort. It’s a reminder of the routine we’ve had for so long.
I know our jobs as parents are to get them out on their own, and we’re mighty damn close to having done that. Yet it’s still good to be remembered and to feel needed. I guess the beauty in the building of self-sufficiency is when they come back even when they don’t really have to. I’m already looking forward to our future visits.
Some of you may know this about me, but I have had lots of experiences in my life that cannot be explained by mainstream science. I have mentioned before how I’ve learned remote viewing but I haven’t talked much about my UFO and ET experiences here. 2022 was the year that I finally shared many of these stories with others. I decided “fuck it,” they really happened to me and I couldn’t bear the thought of getting hit by a bus someday without ever having shared these with anyone. Not that I’m ready for any buses to come careening towards me, mind you. I joined an Experiencer group called CERO this year that consists of people like me who have had an ET experience (a.k.a. “Experiencers”). Many of my memories of those events have been vetted by others who have had similar experiences. I’ve also been happy to help fill in the blanks for others by adding my experiences and thoughts.
ET experience is still one of the most taboo subjects. When it happens to you, you can feel like the only one on the planet who’s had this experience. Not only do you feel incredibly lonely to find yourself the sole human in a roomful of aliens, you are lonely afterward because you don’t have anyone you can tell about it. Finding people who can relate is a huge, huge blessing because otherwise you can think you’re going nuts.
I felt comfortable enough to go on podcasts with some of my experiences, such as Cameron Logan’s “This is my Alien Life” podcast and my friend Nicolle Morrock’s P.E.E.P. podcast (episode 46). I also sat down in November for a video interview with Bill Howard for his upcoming documentary on Experiencers. We spoke for nearly two hours! Those who have seen the teaser clips have praised me for my candor and humor about it. I really look forward to the release of the whole documentary, coming to the Reveel streaming service soon.
Did it happen? Yes. Will some of y’all think I’m nuts? Yes. Do I care? Not in the least. I call ’em like I see ’em and I am not about to pretend that what happened to me didn’t happen, especially if it’s only to keep people from getting uncomfortable. Do I think I’m special? Sadly, no. You’d honestly be shocked at how many people this happens to. If I’m special, it’s only because I’m one of the few people who:
recognizes it for what it is, and
is comfortable talking about it.
On this note, late last year I publicly published a blog post that I had composed eight years prior, couching my alien experience in terms of a dream. You can read it here.
Speaking of public speaking, I got the opportunity to dust off some public service skills this year, including public speaking and organizing. Earlier last year, one of my friends currently serving on Raleigh’s Parks Board invited me as a former Board member to join the board’s Fred Fletcher Award committee. I agreed and also brought along my friend Scott Reston. We spent a few Zoom meetings collaborating with other committee members to weigh the contributions of so many of Raleigh’s parks volunteers and make the hard decision on whose work deserved an award. The Fred Fletcher Awards Ceremony was always one of my favorite events and I was thrilled to help make it happen again. I was beaming when I took the stage on May 10th to give a speech honoring my awardees. I didn’t miss a beat and it rekindled nostalgia for my days of civic leadership.
Then this fall, I dusted off my CAC skills with a one-off meeting of the old East CAC. I was approached by my friend and former CAC chair, Lynette Pitt, about getting another East CAC meeting done. There were neighborhood concerns about crime and development which weren’t being addressed. I met with her and another former CAC official to get something planned. In October, we held a meeting at the Lions Park picnic shelter and several dozen folks showed up. We even had free food donated by Brookside Bodega. We heard from the community on what issues need addressing and made plans to meet again to address them. I look forward to working with Lynette to make sure our community needs are addressed.
Late in 2021, I was reminded about the North Carolina Reading Service (formerly the Triangle Area Reading Service), either in a news story or a friend’s mention. I decided to dust off my speaking skills and get back in there as a volunteer. I started again as a fill-in volunteer, and for the first time in ten years I was back behind the microphone in February, reading a shift. Since then I’ve gone on to read about a half-dozen times over the year, doing the evening shift and reading USA Today. I have been mostly diligent in saving the resulting recordings off the website and listening critically to how I sound. This has helped me considerably with my public speaking skills and I feel good about performing a service for the sight-impaired folks in my community. It’s always been fun!
I love to sing. Just love it. And at times I have needed to scratch the itch of public performance that I haven’t been able to get through DNR. So, when I feel up to it, I’ve been going out to the various karaoke shows around town to get my “sing on.” My favorite is Steve Scott’s show at the Brickhouse on Hillsborough Street on Friday nights. Steve’s shows have some really good regulars and a crowd that seems to appreciate good singing. Beyond that is Harryoke’s show at the Raleigh Beer Garden on Monday evenings. Another good crowd that appreciates good singing, and Harry has a good selection of songs. I also go to Jacob Sobel’s shows here and there, though his show at Picked Pub in north Raleigh seems to have been handed off to another KJ. So much depends on the audience’s response for me, and there are some venues where my songs or style just don’t click.
For a few weeks this fall, I tried a new show just starting out on Wednesday nights at the Dueling Piano Bar on Glenwood Avenue. It’s a first-rate club with lights, stage smoke, and a decent sound system. The crowd was small but into it, with a lot of good singers. The club had a karaoke contest each week, awarding $100 to the best singer. I competed twice but never won and got frustrated with the process, so I stopped going. I may revisit it later, though.
In Randleman this summer, I drove miles north to a club called Kamikaze’s Tavern just on the edge of Greensboro. It was mostly empty on karaoke night but I blew people away with my performance. Some regulars actually asked me “you mean, you drove all the way to sing here?” Ha! Not quite, but I totally did enjoy performing and I hope to get back there again someday either for karaoke or with my full DNR band.
2022 was the year Kelly and I became empty nesters. We’d gotten Hallie settled at Chapel Hill in 2020 and for a while Travis was a virtual only child at home. In March, he and I went on college tours to UNC Charlotte (taking Amtrak) and to Appalachian State. He liked App more than UNCC but the small class sizes of UNC Asheville appealed the most to him. He graduated Enloe with honors in June, spent the summer working for a short while, and got settled in at UNCA in August. He’s been thriving there! He is delving into engineering classes there, making friends, playing on the school’s ultimate Frisbee team, too. He also makes time for bike rides and learning the banjo in his spare time.
As for Hallie, she spent the spring semester in Montana doing field work in environmental science outside of Glacier National Park. In-between sciency stuff, she would go on miles-long hikes in the mountains and camp out. It’s been wonderful watching her love of science blossom, though I think she prefers field work to lab work! After a short break at home, she left in the summer for a semester in Highlands, NC doing more field work. She was studying the effects of stream pollution on the hemlock trees in the area and became an expert on taking core samples. More hiking, camping, and waterfall exploration took place, in addition to road trips to Tennessee to go see bands. Hallie was initially unsure she would like her time in Highlands but was soon loving it and was sad when it came to a close. I don’t doubt there will always be a place in Hallie’s heart for mountains and I love seeing it.
So, how have Kelly and I been using our time as empty nesters? We took a week to stay at a riverside cabin in Randleman, NC back in June. I was jobless by that time but we didn’t let that ruin our break. We paddled on the river, paddleboarded on the local lake, and did lots of hiking in a North Carolina forest I’d never explored before: the Uwharrie National Forest. I loved our little trip there and want to explore more of all the parks in North Carolina I have yet to see.
Beyond the Randleman trip, we’ve not done much. The flaw in our Empty Nester plan is that we may not have kids but we do have two dogs. Boarding two dogs for a week turns out to be quite expensive. This tends to put a damper on any spontaneous plans to get up and go somewhere. In fact, last month Kelly took the kids up to visit her parents while I stayed home. Boarding the dogs while I have no income didn’t seem to be a prudent use of our money.
This was the year that I put my money where my mouth is and actually got going with a band, DNR. I had first auditioned on December 30, 2021 but met the full band at my first rehearsal on January 15th. It was awkward for me at first because as the frontman / lead singer the band would often look to me for direction on what song to work on next. It was my first band and my first freaking rehearsal, so I really didn’t have a clue what I was doing. I just winged it, though, and figured it out as I went. I’m the noob in the band and the youngest, with the rest of the group having a decade or more of playing. They make it easy for me to fit in, though, and we’ve spent hundreds of hours of diligent rehearsing to perfect our sets. We played three private parties in 2022, which was great experience to be out in front of an audience, but I’ve always hungered for more! We have a dozen or so gigs lined up for 2023 at local bars and breweries and I can’t wait to get out there and entertain folks again!
I learned on December 21st that my friend and shipmate, Matt Feath, died after some recent heart surgery. Matt was a flaming liberal in a military uniform disguise when we served together on the USS ELLIOT. I kept my politics to myself when I served and he did as well, but once he had retired and I was out I really began to appreciate him. It is a lonely thing to be progressive in a conservative culture like the military. I assume Matt did it for the same reason I did it: purely out of love of country, and not because of some forced, fake patriotism that drives some on the right. He and I traded messages on Facebook every few days, swapping takes on national politics, good bands and musicians, military life, and mutual interest in spooky topics such as UFOs, remote viewing, and the like. I was delighted that our friendship had blossomed the way it had. I smiled at all the photos Matt shared of time he spent with his kids, on whom he absolutely doted.
Matt was very loyal to his family and friends. He helped me find my way as a liberal veteran – that is, a former military member who actually gives a shit about others. He introduced me to a lot of good music, too. He was an atheist for much of the time that I’d known him but I suspect some of the woo woo interests I’d shared with him may have made him think that perhaps there is something more to material life. When I eventually cross the border into the next world, I look forward to sharing a beer with my good shipmate, Matt.
Twenty twenty-two was the year that COVID came home to roost at the Turner household. After masking nearly everywhere, Travis went to his high-school chorus rehearsal unmasked on April 6th and the next day was sick. He spent the next week or so coughing and hardly leaving his bed. Kelly and I masked up and tended to him, and we both dodged the bullet.
It was a little over a month later that it was Kelly’s turn to get COVID. She tested positive on May 15th and was sapped of energy for about a week. Kelly isolated in our guest bedroom while I took care of her. We’re not sure where she picked it up: possibly a work event or a social event.
Hallie was the latest to come down with COVID, testing positive on December 22nd. She had dodged COVID when it affected her Carrboro roommates, drove for hours with her sick friend Jonas as they traveled the state, and kept healthy during her semester at Highlands with the exception of a short bout with the flu. She isolated in her bedroom here at home for over a week, always answering a sad “not good” when I’d ask how she was feeling. Only after Christmas did she seem to start feeling better. We all masked up in the house and I kept the HEPA air filter running day and night, and thankfully we are all healthy again.
I have never tested positive for COVID, by either PCR or rapid antigen tests. That is not to say that I have dodged COVID, because I think the odds of that are pretty small. Still, I have never stopped masking up in public places. I got another COVID booster vaccine in May and the bivalent booster in September. Yet, in spite of the negative tests, I have noticed the “COVID toe” phenomenon appearing occasionally. And the day before I thought to test Hallie, my right shoulder began aching – the same one that I got my last COVID shot in. It always makes me wonder whether I’ve been exposed to COVID but my body is just very good at fighting it. Who knows? I am adamant about staying healthy and that’s that.
The biggest challenge of 2022 for me personally was job changes. Up until June, I had a job that I loved at Pattern Health, working with old friends at a startup and building new skills. It was about all I could ask for until one day in June when my boss called me into an impromptu Zoom meeting to tell me the board was cutting the workforce and I had been laid off. My manager, John, is a good friend. He was as surprised and saddened as anyone and I know it must have been tough for him to do it. I didn’t take it personally – startups don’t always succeed – but it is never convenient to have to look for work. The silver lining is that within two days, I had 42 leads from my huge network of friends. I had to make a spreadsheet to keep up with all the suggestions.
After a vigorous job search I wound up getting three job offers. I politely turned the first one down. The team was really nice but the role itself was largely Windows-focused and I didn’t want my Linux skills to atrophy. The second was for a local startup company that would’ve gotten me (or at least one foot of me) back into sales engineering in addition to DevOps. I considered it but turned it down as well since I didn’t think I clicked with upper management and there were some things I saw that raised some ethical questions.
I wound up taking the third offer even though it was significantly lower than the other two. It was for an even tinier startup, but the company was intriguing and I would be working a four-day workweek. I felt good about working for a company that claimed to want to give back, and was looking forward to having some extra time to devote to my own projects. Well, it turned out the company didn’t really live up to its values. Material information was withheld until after I’d accepted the offer. My manager did not play well with others, alienating me as well as the developers we ostensibly were serving. The DevOps work I had hoped to be doing never materialized. The four-day work week? Well, they didn’t really mean that. I was relieved when we parted ways. To think that I had chosen this employer based on ethics is laughable now.
So, 2022 is in the books. It was another year on the planet, another year of learning, and for that I am grateful. Still, a few of the lessons of 2022 were pretty shitty ones, overall. Even so, there were great achievements in 2022 as well. So here’s a recap. This year I will mention the things that didn’t go so well right at the start, so that I can focus on the things that did go well. Part of my focus for 2023 is to celebrate the good things, of which there are many and to which I don’t typically give proper credit.
Today marks one year since I decided to stop drinking alcohol. I can’t say I really planned to get here. It started out as an experiment to see how abstaining would affect my health. I figured that I would probably sleep better and feel better about my health if I stopped drinking. I was not a heavy drinker. I usually stopped at one drink and can’t remember a recent time when it was ever more than two. Still, I had gotten into the habit of having one drink in the evening and that over time would add up.
One thing I asked myself is why I was drinking. I recognized that alcohol often gives one freedom to shift blame for one’s own behavior. “Blame it on the booze.” I was never one to act crazy, regardless, but I decided it is better to own my behavior at all times.
There are also some people who drink because they aren’t happy with their lives. While my life does have its challenges (just like everyone else’s), again I would own my behavior and accept my situation, whatever it may be. I want to always be clear-eyed.
So, an initial two week trial period soon became a month. A month became six months. Six months became a year. I attended many parties, social events, and company meals where drinks were consumed by others but not by me. Previous attempts to stop drinking always seemed awkward when I would be out somewhere and the only one not drinking. Not this time around! I have learned that I can still have fun, be myself, entertain, and not drink. I feel no compulsion now whatsoever to drink.
It’s been an investment in my health, both physical and mental. I have lost weight and gotten rid of my gut. I sleep better now and remember my dreams far better than I once did. My mood is better. Most of all, I take pride in who I am and don’t feel the need to cede my power to alcohol.
As my streak continued, I debated whether I would have a celebratory drink on my one-year anniversary. In light of the improvements that this choice has brought me, I don’t feel the need for any celebratory drink. This is a path that has proven worthy of following. I think I will see where it leads.
I am always optimistic with the New Year, but 2023 has already brought me one disappointment: the death of the Dark Sky weather app.
My friend Ben introduced me to Dark Sky many years ago. Unlike most weather apps, Dark Sky focused on micro-forecasts. It would tell you when rain was imminent, how strong it would be, and how long it would last. It was amazingly useful. When I was at Lollapalooza back in 2017 a torrential downpour hit Chicago – so strong that it shut the festival down. But I remained relatively dry because Dark Sky warned me it was coming with about a 10 minute lead time. That allowed me to run to the subway and get out of the weather before the skies opened up.
In 2020, Apple bought Dark Sky and as of yesterday the Dark Sky app no longer works. The functionality has been added, or shall I say buried, within the default Apple weather app.
For a company like Apple that prides itself on UI/UX you would think they would do a better job of it. This is a screenshot of the Dark Sky app just before midnight on the last day of 2022.
With one click you see the current temperature, the rain outlook, and a timeline for how long the rain will last.
In the Apple Weather app you have to open it, scroll down until you find the precipitation widget (and not the precipitation map), click on that and you can kind of figure out the rain forecast if you look hard enough. Here is the prediction for Wednesday.
I mean, I’ll get used to it, but it is sometimes hard to say goodbye to something you’ve used for years. If the app were open source there is a chance it would live on, but when we opt to use proprietary software we also cede a lot of choices to the software vendor. So it goes.
The thing I like most about my job is that I get to meet and work with amazing people. Recently I traveled to Helsinki, Finland, to attend the MariaDB Server Fest conference. It was a great experience and I met some very talented people, including Monty Widenius himself.
Note: The usual disclaimer applies: this is my personal blog and what I write here does not necessarily reflect the views of my employer, Amazon Web Services.
My role at AWS is to work with open source companies and communities and to act as a liaison between them and Amazon. In thinking about important open source projects one of the first that comes to mind is MariaDB.
When I first got seriously involved in open source back in 2001, the MySQL database was an example of an open source success story. While a lot of the focus of the early days of open source was on the operating system, MySQL demonstrated that open source applications were powerful enough to compete with existing proprietary solutions. Plus, if you were building an open source application, quite often you needed a database, and MySQL provided a great option.
Many of us expected MySQL to IPO, but instead the company was bought by Sun Microsystems. That wasn’t too worrisome since Sun was a big proponent of open source, but when Sun was bought out by Oracle a couple of years later, that all changed.
On the day the acquisition was announced, Monty Widenius (the lead developer of MySQL) announced a fork of MySQL called MariaDB. In the years since then, a lot of people have replaced MySQL with MariaDB. While Oracle has continued to work on MySQL, the last major release, version 8.0, came out in April of 2018, so one must wonder how motivated they are to work on a product that competes with their main proprietary offering.
When I learned about Server Fest I decided to attend. As much as I like the ease of remote communication, sometimes nothing beats meeting face to face. I had also been to Helsinki a couple of times before and I really like the city, although I really should try to visit once in the summertime.
Boarding Sign for Flight to Helsinki
I flew from North Carolina to JFK and then took a Finnair flight to Finland. Helsinki is seven hours ahead of New York, so it is one of those weird trips where you leave in the night and land the following afternoon. When I travel I tend to stay at Marriott properties, but all the Marriott affiliated hotels were booked. I later learned that this was because a popular start-up conference called Slush was happening at the same time as Server Fest. Because of this there was no meeting space for rent, so the MariaDB event was being held at Monty’s house, which I thought was kind of cool.
Proposed schedule for Server Fest 2022
The conference was on Thursday, November 17th, and was going to be live-streamed on YouTube. In order to better match up with the time zone in New York, it started around 3pm and ran into the night. I arrived mid-morning.
When you walk into Monty’s house, the first thing you notice is that it has a very open floorplan. Directly across from the entryway is a huge table that can probably seat about 20 people, and that’s where most folks had set up their laptops. To the right of that was a large kitchen, and to the left was an open area where the walls were lined with bookcases, and that is where lights and cameras had been set up for the livestream.
Now MariaDB is organized in two parts. There is the MariaDB corporation, which is the main commercial enterprise behind the project, and there is the MariaDB Foundation, which manages project governance and promotion. Both were represented in the day’s presenters, and I also got to meet and spend a lot of time with Kaj Arnö, who is the CEO of the Foundation.
I also got to meet the true boss of the event, Anna Widenius, Monty’s wife. As you can imagine, getting a bunch of open source geeks organized is like herding cats, but she did a great job in getting the conference underway and keeping it moving.
The first speaker talked about “Chasing Bugs in Production”. When Wikimedia upgraded from MariaDB 10.4 to 10.6 they ran into a performance issue. His talk described their upgrade process and how they were able to work with MariaDB to get the issue addressed. I also found it interesting that they run MariaDB on bare metal. So much of today’s IT infrastructure is based on clouds and Kubernetes that it was refreshing to see someone taking advantage of individual servers when it makes sense.
There was a bit of a hiccup with the second speaker who was supposed to join remotely, so Monty Widenius moved his presentation on query optimization in MariaDB to the second slot.
Monty Widenius Presenting
There are several methods that can be used to execute a query against a database, and a good optimizer chooses the plan that returns the result in the fastest time. In MariaDB 11, Monty has changed the optimizer to be cost-based (versus rule-based), with parameters that can be tuned by the user. This has resulted in more efficient queries and thus a better user experience.
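To make the tuning part a bit more concrete, here is a minimal sketch (my own illustration, not anything from the talk) of inspecting and nudging those cost parameters from Python with the mariadb connector. The table, connection details, and the specific optimizer_where_cost variable are assumptions on my part; SHOW VARIABLES is the authoritative list of what your server actually exposes.

```python
# A minimal sketch of poking at the MariaDB 11 cost-based optimizer from Python.
# Assumptions: the "mariadb" connector is installed, a local server is running,
# and the cost variable used below exists in your build. Check
# SHOW VARIABLES LIKE 'optimizer%cost%' rather than trusting my memory.
import mariadb

conn = mariadb.connect(user="app", password="secret", host="localhost", database="shop")
cur = conn.cursor()

# See which cost parameters this server exposes for tuning.
cur.execute("SHOW VARIABLES LIKE 'optimizer%cost%'")
for name, value in cur:
    print(name, value)

# Inspect the plan the optimizer chose for a query against a hypothetical table.
cur.execute("EXPLAIN SELECT * FROM orders WHERE customer_id = 42")
for row in cur:
    print(row)

# Nudge one cost parameter for this session only (hypothetical value), then
# re-run EXPLAIN to see whether the chosen plan changes.
cur.execute("SET SESSION optimizer_where_cost = 0.1")
cur.execute("EXPLAIN SELECT * FROM orders WHERE customer_id = 42")
for row in cur:
    print(row)

conn.close()
```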
The next speaker’s topic involved the community, and I really liked his comment that MariaDB is “dependent on its users to give MariaDB its purpose”, which I thought was pretty insightful.
Another presentation covered MindsDB, which allows you to integrate machine learning easily into your database. In his example the speaker used a model trained by Hugging Face to analyze text in order to detect “sentiment”, i.e. whether the text is positive, negative or neutral. And you access all of this using SQL queries.
For example, suppose you have a blog or other website where users can submit comments. MindsDB would allow you to examine those comments to detect general sentiment without having to learn an entirely new system. I thought it was pretty cool.
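Since the SQL angle was the part that grabbed me, here is a rough sketch of what that flow looks like. I am writing this from memory of the MindsDB docs rather than from the talk, so treat the port, the CREATE MODEL options, and the table names as assumptions to verify against their documentation.

```python
# A rough sketch of the MindsDB flow described above. The exact syntax, the
# Hugging Face engine options, and the default port are from memory and may
# differ between MindsDB versions; verify against the docs before relying on it.
import mariadb  # MindsDB speaks the MySQL wire protocol, so this connector should work

conn = mariadb.connect(host="127.0.0.1", port=47335, user="mindsdb", password="")
cur = conn.cursor()

# Register a sentiment model backed by a Hugging Face text classifier.
cur.execute("""
    CREATE MODEL mindsdb.comment_sentiment
    PREDICT sentiment
    USING engine = 'huggingface',
          task = 'text-classification',
          input_column = 'comment'
""")

# Score every blog comment by joining the source table against the model.
cur.execute("""
    SELECT t.comment, m.sentiment
    FROM my_blog_db.comments AS t
    JOIN mindsdb.comment_sentiment AS m
""")
for comment, sentiment in cur:
    print(sentiment, comment[:60])

conn.close()
```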
The next talk resonated with me, as it focused on building the sponsorship community within MariaDB, for both individuals and entities. MariaDB is an important piece of technology and there are a lot of stakeholders, and this talk really reinforced the idea of a “big tent” environment within the project.
For the next presentation we finally got to hear from Federico Razzoli, founder of Vettabase (he was originally scheduled to go second but there was some time zone confusion), as he talked about new MariaDB features to learn “for a happy life”.
Federico Razzoli via Zoom with Kaj Arnö
He started off with the comment that MariaDB (and open source projects in general) are very good at creating new features and not so good about documenting or advertising them. He discussed the most recent releases of MariaDB and then highlighted various new features that people should find useful.
The seventh presentation was by Sergey Petrunia and revisited the optimizer, but focused on changes made before Monty’s changes in MariaDB 11.
Sergey Petrunia Presenting
His talk focused on those changes made in the last year (i.e. since the last Server Fest) and it looks like a lot of progress was made to make the optimizer more consistent.
From what I can tell, the idea behind MVCC (Multi-Version Concurrency Control) is that active databases are constantly processing transactions but there is a need to provide a consistent “view” at a given point in time, so MVCC determines which transactions are considered committed at that point in time and which are not. This prevents someone reading from the database from being served incomplete information.
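Here is a small sketch, using a hypothetical orders table and the mariadb Python connector, of the behavior just described: under REPEATABLE READ a reader keeps the snapshot it started with until its transaction ends, even after a writer commits.

```python
# A small sketch of MVCC behavior: the reader's open snapshot does not include
# the writer's committed change until the reader starts a new transaction.
# Connection details and the orders table are placeholders.
import mariadb

def connect():
    return mariadb.connect(user="app", password="secret", host="localhost", database="shop")

reader = connect()
writer = connect()
rcur, wcur = reader.cursor(), writer.cursor()

rcur.execute("SET SESSION TRANSACTION ISOLATION LEVEL REPEATABLE READ")
rcur.execute("START TRANSACTION")
rcur.execute("SELECT COUNT(*) FROM orders")
print("reader sees:", rcur.fetchone()[0])        # snapshot is taken here

wcur.execute("INSERT INTO orders (customer_id) VALUES (42)")
writer.commit()                                   # writer's change is committed...

rcur.execute("SELECT COUNT(*) FROM orders")
print("reader still sees:", rcur.fetchone()[0])   # ...but the open snapshot does not include it

reader.commit()                                   # end the transaction, drop the snapshot
rcur.execute("SELECT COUNT(*) FROM orders")
print("after new snapshot:", rcur.fetchone()[0])

reader.close()
writer.close()
```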
As transparency is key to any open source project, MariaDB publishes statistics on code contributions. The latest one I can find is through September of this year, and I was happy to see Amazon on the list of contributors to the MariaDB Server code.
Of course the majority of code commits, nearly 80%, were done by the MariaDB corporation, and another 14% by the MariaDB Foundation. Amazon represented 1.42% of contributions, but Andrew pointed out to me that they came from 14 unique committers versus 8 from the Foundation. I’d love to see that involvement increase.
The MariaDB binary log (binlog) contains a record of all changes to the databases, both data and structure. The mariadb-binlog command-line tool lets you examine this log, and it now supports Global Transaction IDs, making it easier to filter transactions.
After the Server Fest stream was over, we got to my favorite part of any conference – the socializing.
I did spend some time talking with Manuel Arostegui. One of my friends, Eric Evans, works at the Wikimedia Foundation focused on Cassandra. It turns out that both Manuel and Eric are in the same department. Small world.
Manuel and Me
We eventually sat down to dinner prepared by the Wideniuses. Monty cooked a huge beef tenderloin, and we talked, sang songs and drank. I managed to get back to my hotel about 1am the next morning.
Usually when I travel home from Europe my flight will leave around noon and I get back in the early afternoon local time. For some reason those flights were over $1000 more than the Finnair flight that left at 5pm, so I returned to Monty’s on Friday morning to visit for a few hours.
I loved the fact that Monty was so welcoming and also that he and his family keep a lot of animals (we do the same). In addition to six cats and three dogs, there is a boa constrictor named Monty Python who is about two meters long. The story I heard was that it was a gift given to a family member that ended up at Monty’s. They originally thought it was a python but later learned it was a boa, but the name stuck.
Monty Python the Boa Constrictor
The trip home was uneventful except for the fact that I got home close to 2am, and I ended up catching a bad case of influenza. To my knowledge no one else at the conference got sick, for which I’m happy, and while the flu knocked me out of commission for almost two weeks, the trip was worth it.
When a platform or a company (or a country, for that matter) becomes more focused on its celebrity leader than on the members or the products or the community, it’s time to leave.
Twitter is now Elon Musk’s private playground, so I no longer want to be a part of it.
I deleted my Twitter account this week. So maybe this is again the place for me to post my random thoughts.
Today marks my three month anniversary with AWS, and I’m loving it. It has been a lot of fun returning to conferences, so I thought I’d post a list of the ones I will be attending for the rest of the year.
The last day of SCaLE was bittersweet, as I didn’t want it to be over but I was also ready to head home.
After stopping by the booth I was eager to visit a session on OpenNMS presented by my friend Jeff Gehlbach.
Jeff has stepped into the presenter role I used to have, and he did a very good job of covering what network flows are, the different types, and why they are important.
Back in the Exhibit Hall I was happy to learn that the AWS booth had won the “Most Memorable” award.
Hats off to Spot and Ashley for coming up with such a cool concept and creating a great space for people to hang out.
At 1:30pm we held a raffle for a pretty nice 3D printer. You had to be present to win and there was a lot of interest.
Then it was time to tear down the booth as the Exhibit Hall closed at 2pm.
This gave us time to get to the closing keynote by Internet pioneer Vint Cerf.
For someone who recently turned 79 he was a dynamic and entertaining speaker, and it was fun to listen to his stories on creating ARPANET, and how it grew into the public Internet we use today.
He also mentioned Jon Postel several times. I had an e-mail correspondence with Jon in the mid-1990s when I was trying to wrap my brain around the process for getting an “enterprise number” from IANA. I didn’t realize until after his untimely death who he was, and I’m still impressed at how much time he was willing to give a newbie like me.
While I enjoyed the presentation, I did regret that we ran out of time for details on his last slide, which concerned “unfinished business”.
I mean, I get it. Each of the six topics on the slide could be a talk on its own, but I was very curious to hear his thoughts on fixing things such as disinformation. I love living in a world with almost instant access to information and the ability to connect with others, but there are problems, too, and I’m not sure we have the solutions.
All in all I am extremely happy to have been able to attend SCaLE. I’m still not comfortable in crowds and I was a little put out that not everyone in attendance decided to honor the mask policy. I talked with the SCaLE staff and they told me they were doing all they could, but even when people were reminded to mask up they tended to remove them as soon as the staff member walked away.
I was especially unhappy when I saw sponsors going maskless. On the one hand I am happy for their support of SCaLE, but on the other when you are standing in front of your company logo showing a disregard for the safety of your potential customers, it sends a bad message.
I’m not bringing this up to start a debate on the efficacy of masks, as I realize that they provide varying degrees of protection depending on type and use, but if your staff isn’t willing to abide by the conference rules, perhaps you just shouldn’t be there.
Note that I did refrain from posting the pictures I took of specific sponsors since it really wouldn’t change anything. I must be getting soft in my old age.
In any case I hope this is a non-issue for SCaLE 20x in Pasadena next March. I’m not optimistic that the pandemic will be over but for me the risk was worth the benefit, and I can’t wait to return.
Day Three of SCaLE kicked off the start of the main conference, which meant I spent most of the day in the AWS booth.
Traffic was pretty good and I got to talk with a lot of interesting people. I did take a break around 2pm and noticed from Twitter that I was missing a talk by Frank Karlitschek of Nextcloud fame, so I skedaddled over to his room to catch it.
It was pretty good. It focused on how copyleft-style licenses are often better for business since they level the playing field for all contributors, versus a number of newer licenses that are more “source available” instead of “open source”.
Please note that I’m an unabashed Nextcloud fanboy so I have some biases. (grin)
The big evening event was “Game Night” where they turned the basement ballrooms into a big gaming playground. From the classics such as checkers and chess, to Vegas-style games such as roulette and blackjack, up to the most modern of games using VR, there was something for everyone.
AWS sponsored the music for the event, and I was eager to see MC Frontalot perform. He didn’t disappoint.
He did an hour-long set spanning the classics to the newer stuff, including “Secrets From the Future” featuring a video generated using AI.
Afterward he hung out at the merch table to chat with folks, and I got to spend some time with a new friend named Silona Bonewald.
I was introduced to Silona through Spot, as she was on the same hotel shuttle bus when we arrived on Wednesday evening. She is in charge of open source at IEEE as well as being a Burner, and I always look forward to the chance to talk with her.
Today is the final day of the conference, and remember if you are reading this before 1:30pm PDT there is a raffle for an awesome 3D printer at the AWS booth, so come by to get your ticket.
This is the first conference since joining AWS where I have booth duty, so I won’t be able to spend as much time in the sessions as I would like, but I did want to catch one of the first sessions of the day, which was “Speedrunning Kubernetes”.
The main reason I wanted to see this talk was to see Kat Cosgrove in action. Prior to coming to AWS I didn’t know about her but I ended up following her on Twitter and found that she has strong opinions, and I tend to like people who have strong opinions. I figured the presentation would be entertaining and that I might learn something.
I wasn’t disappointed.
The title alludes to a “speedrun” which is an attempt to complete a video game as quickly as possible. The goal of this talk was to bring up a working Kubernetes cluster as if you were doing a speedrun. It also included one of the more … unusual … analogies I’ve seen in a technical presentation (including my own) by using a Chihuahua as a metaphor.
If the goal is to provide the “cheeseburger” application, consisting of the bun service, the patty service, the cheese service, the mustard service, etc., each instance of the application (i.e. each burger) can be considered a “pod”. Each foot of the dog stands on two pods, making each foot a two-pod “node”, and the dog itself forms the control plane.
Remember, now that you’ve seen it, you can’t unsee it.
That was the only session I made on Day Two, but I did get some time to wander around the Exhibit Hall. The Software Freedom Conservancy had a booth, and since they are one of my favorite organizations I stopped by to chat with Pono Takamori. I know a number of folks that work there and they serve as almost a reference implementation for trying to live using 100% free software. Pono was telling me that it was getting almost impossible to find a totally free mobile wireless solution since 3G went away, as all of the modern modems tend to use binary blobs.
Now, when these exhibit halls are being set up, the “booths” are laid out with little generic signs showing the owner of the booth, and most of the time they eventually get covered up once the booth is complete.
I know the Sun acquisition was a long time ago, but I still get cognitive dissonance when I see a MySQL sign next to an Oracle one.
The AWS booth for this conference is really awesome. I bow down to the genius that is Spot Callaway; he pitched a booth design meant to evoke a teenage geek’s basement, where one might play video games and Dungeons and Dragons (think Stranger Things). The walls of the booth are made to look like brick, and there are chairs, a couch and an SNES console emulator.
The featured AWS project for this conference is Bottlerocket, and I got to learn a bit about it and meet members of the team. Bottlerocket is a minimal operating system designed just to run containers. I compared it to LibreELEC, which is a purpose-built O/S that I use to run Kodi, and while it was explained to me that I was oversimplifying things a bit, it was otherwise a good analogy.
While it is, of course, being used within AWS, it is a 100% open source project and you can get the code on GitHub, and the hope is that others will find it valuable and will get involved with the community. If this is something you’re into, stop by the booth and say “hi”.
Speaking of stopping by the booth, we do have some tasty sodas and Bottlerocket branded bottle openers, but the big giveaway is an awesome 3D printer. Get a raffle ticket and stop by the booth at 1:30pm on Sunday for the drawing (you must be present to win).
AWS employees are not eligible to participate. (sniff)
I am back at the Southern California Linux Expo (SCaLE) for the first time in many years, and I was surprised at how happy this makes me. It is always a well run conference and it tends to bring a lot of people I like together in one place, which means I get to meet a lot more people to like as well.
The main SCaLE sessions occur over the weekend, but there are a lot of cool things that happen in the days before. For Thursday, AWS sponsored Cloud Native Builder Day to showcase some of the amazing open source technologies one can use to solve a number of challenges, and I was eager to learn about them.
But before that I needed to get registered. The first step was to show proof of vaccination. While I am thankful that we can have these events, COVID is still a thing and the organizers are doing all they can to mitigate the risk to the conference attendees. Since I’m an old I’ve had two shots and two boosters but the darn thing keeps mutating.
Once past that I headed upstairs where I could use the self check-in kiosks. It was pretty simple to sign in and get my badge printed, and then it was just a short trip down the hall to pick up the conference “swag bag” which included the badge holder and lanyard.
The only change I would make to the process is that once you printed your badge, you should really hit the “close window” button on the screen, as there is a “back” button that could allow the next person who registers to see your name and e-mail. No biggie, but the security nerd in me always thinks about these things.
The conference spans two floors. The Exhibit Hall with the sponsor booths is on the ground floor behind registration (it is technically in the Plaza Ballroom so I just followed the signs for “ballrooms”) while the sessions are on the second floor along with registration. AWS is going to have a pretty cool booth this year.
As an AWS employee I guess I should say that we always have a cool booth (grin) but I especially like the idea behind this one, despite the fact that we were unable to get a mounted deer head (seriously). It’s booth numbers 300, 302 and 304 if you want to swing by, and for those of you who couldn’t make it I’ll be sure to post about it later.
Cloud Native Builder Day showcased three different open source projects, the first one being Triggermesh. This was presented by Jeff Naef who I immediately liked as he was the first to notice that my mask is made by K&N, a company known for their high-end automotive airflow products. He loves performance automobiles as well as open source (he was wearing a Snap-On tools hat) so I knew we would get along.
In dealing with cloud native technologies, a lot of the workflow is event driven. Triggermesh lets you seamlessly link together sources and targets for events, normalizing and enriching them along the way. While it does support the ability to create functions using code (in a variety of languages) a lot of the implementation can be done just through configuration.
In one example the data was encoded in base64, and a person asked if Triggermesh could render that in clear text. Jeff was like, sure, and he bravely set out to implement that as we watched. He got really close, but in any case deserves kudos for the attempt, especially considering he was holding a microphone with one hand the entire time.
The next speaker was Zoe Steinkamp from InfluxDB. I first met Zoe at the Open Source Summit in Austin and she is one of my favorite new acquaintances I’ve met through my job at AWS.
Now full disclosure: I missed the first half of her presentation.
SCaLE has done something delightful with the schedule, which is allowing 30 minutes between talks. I’ve talked about this before but this lets speakers switch out without the usual urgency, allows more time for attendees to interact with the speaker after the talk, and improves the hallway track.
I thought I had enough time to grab lunch, which was In-N-Out that Spot had brought for me. We don’t have In-N-Out in North Carolina so I rarely pass up a chance to get it, and I figured I could be back in time. I was wrong. But I did slip into the back of the room which is why this picture isn’t as close as the others.
I used to work on an open source project that relied heavily on time series data, so I’m a bit of a time series data geek. Every time I see a presentation on InfluxDB I learn more things to like about it. This time I found out that it is possible to get started with it without being a programmer. A lot of people in the data science field aren’t coders, but they can send their data to InfluxDB pretty easily. The folks at Influx have created InfluxDB University as a free resource to get the most out of their solution, and while I haven’t gone through it yet it looks really comprehensive.
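As an example of how low the barrier is, here is a minimal sketch of writing and querying a single measurement with the InfluxDB 2.x Python client; the URL, token, org, and bucket names are placeholders for whatever your own setup uses.

```python
# A minimal sketch of writing one point and reading it back with the InfluxDB
# 2.x Python client. URL, token, org, and bucket are placeholders.
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")

# Write one temperature reading tagged by room.
write_api = client.write_api(write_options=SYNCHRONOUS)
write_api.write(
    bucket="sensors",
    record=Point("temperature").tag("room", "office").field("celsius", 22.5),
)

# Query the last hour of readings back with Flux.
query = 'from(bucket: "sensors") |> range(start: -1h) |> filter(fn: (r) => r._measurement == "temperature")'
for table in client.query_api().query(query):
    for record in table.records:
        print(record.get_time(), record.get_value())

client.close()
```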
When most people hear the word “database” they think of relational databases. This is a data structure usually based on “rows” of data made up of “fields” and indexed by a primary key. One then uses something like the Structured Query Language (SQL) to retrieve values from those fields. This is all well and good but it tends to be extremely monolithic, which doesn’t work well in today’s distributed cloud environment.
Think about it. In a datacenter you might have sub-millisecond latency, so a query can be returned quickly. Move that datacenter across the country, and now your latency is, say, 100ms. Move that to the other side of the world and, well, you get the picture. Now if you only have a few queries that might be okay, but when you consider thousands and then millions of queries, the response time of your application is going to take a hit.
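Here is the back-of-the-envelope version of that argument; the query count is made up, but it shows how quickly sequential round trips add up once the database is far away.

```python
# Back-of-the-envelope math: the same workload, just with the database farther
# away. Sequential round trips dominate quickly. The 25-query page is hypothetical.
queries_per_page = 25
for label, rtt_ms in [("same datacenter", 0.5), ("across the country", 100), ("other side of the world", 250)]:
    total_ms = queries_per_page * rtt_ms
    print(f"{label:24s} {rtt_ms:6.1f} ms/query -> {total_ms/1000:6.2f} s just waiting on the network")
```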
Cassandra allows you to distribute that data both within a datacenter (for reliability) and also regionally. You can then put your data near your customers, improving their experience.
I was already sold on Cassandra (we used it at OpenNMS) but what I learned from this presentation was the wonderfulness that is “k8ssandra” (kate-sandra). This is Cassandra but running in Kubernetes. If you have ever had to extend and expand a Cassandra cluster, you know that while it isn’t super difficult there are a number of gotchas that can cause problems. What if you could automate it? Matt showed us an example that let him spin up (and tear down) an 800 node cluster in minutes.
Cool, huh?
The first day of SCaLE 19x was a blast, and I am eager to see what the rest of the week brings.
The 19th iteration of the Southern California Linux Expo (SCaLE) is around two weeks away, and I wanted to suggest some reasons why you should attend, assuming you are into free and open source software. AWS, where I work, is a platinum sponsor. The conference runs for four days starting on July 28th and is located at the Los Angeles Airport Hilton.
Note: Everything expressed here represents my own thoughts and opinions and I am not speaking for my employer Amazon Web Services.
I’ve been to a number of SCaLE conferences and I’m always impressed at how well they are run. This is a grass-roots, volunteer-led conference, yet it is always on par with the more commercial trade shows I attend and sometimes exceeds them. This year looks exceptionally good.
The first reason you should go is the content. The conference has quite a number of tracks including one focused on containers and orchestration (‘natch) and also infrastructure, security and observability. There are tracks on using open source in the medical field as well as government. Big Data gets its own track as well as embedded systems, and there are several more tracks guaranteed to touch on almost every interest within free and open source software.
The conference spans four days, with the first two days focused more on workshops. Co-located with SCaLE is a two day, two track technical conference focused on PostgreSQL, and on Friday is the tenth DevOps Day LA. AWS is hosting a half-day workshop focused on Cloud Native builders with presentations on Kubernetes, InfluxDB and Apache Cassandra.
The second reason you should go is networking, or what is often called the “hallway track”.
For the last several years I’ve worked remotely (i.e. not in an office outside of the home) and I will probably continue to do so for the rest of my career. Remote work has become almost a standard within technical jobs.
But I have to say I miss being able to see people face to face. When I was with OpenNMS we had this product where you could buy a year of support coupled with a week of on-site professional services and training. I used to love doing those, but even before COVID those trips became less frequent as companies adopted a distributed work force. There was really no “on-site” place to go when your team was across four time zones.
Technical conferences, such as SCaLE, provide a great opportunity to get together in person, and it can be wonderful to talk in an informal setting to people you may only know through e-mails, video calls and social media. A number of my coworkers will be at SCaLE and I am looking forward to spending some “in real life” time with them.
If you look through the list of speakers at this year’s conference, it is a “who’s who” of open source leaders and contributors, and you’ll have the chance to meet them as well as other like-minded people. I love the fact that the organizers have built in a 30 minute cushion between talks. Not only does this avoid the rush that usually happens as one speaker finishes and another sets up, it gives people time to socialize before heading off to the next talk. Of course, it goes without saying that you should be courteous to speakers and other attendees, and SCaLE has published a Code of Conduct to formalize what that means, but don’t let that stop you from asking tough or difficult questions of the speakers (just be nice about it). I always loved it when I was a speaker and someone asked me something I had never thought about.
The third reason you should go is the Exhibition Hall. There are a ton of sponsors who will have booths at the show (including AWS) and this is a great chance to talk with those projects you love, find new ones to love, and often there is some great swag to be had. The hall will be open on Friday through Sunday.
Finally, on Saturday night there is the famous “Game Night” reception and party. I’m excited that the original nerdcore rapper, MC Frontalot, will be performing. Frontalot combines musicianship with nerdy topics like video games, cosplay, fairy tales and technology into an incredibly entertaining show. If you are new to his work check out his YouTube channel. One of my favorite songs is “Stoop Sale” (kids especially like that one, so I guess I’m a kid at heart), and he recently had a fan take his song “Secrets from the Future” (about how all of our encrypted secrets will one day be an open book) and run the lyrics through the Midjourney AI image generator. The result is pretty amazing.
A full SCaLE pass runs $85, and I can’t think of a better value. In-person technical instruction runs $500+ a day, and even if you went to one of those on-line class sites you’re still going to pay $15-$50 a class, and here you can attend 15 or so sessions for around $5 per, and that doesn’t include all the extra stuff outside of the presentations. Even with travel it is still a deal.
I am very eager to attend and I hope to see you there, too.
Just one more note, this one on COVID. I am pretty rigorous when it comes to avoiding this disease which is one reason I haven’t traveled much in the last 2+ years. The first conference I attended since the pandemic started was the Open Source Summit in Austin, and while some people did test positive it was a small fraction of total attendees. One reason was that they had a mask requirement (except when eating or drinking) and you had to show proof of vaccination or a negative test. SCaLE has adopted a similar policy, and while this won’t mean it is impossible to get sick the evidence suggests that this will greatly limit exposure among the attendees. If you have health issues you may still want to stay home and if you come and don’t feel well use your best judgement. I will be taking along some rapid tests that I got for free from covid.gov as well as frequently taking my temperature just to be sure.
I always feel a little sad on the last day of any conference, and Open Source Summit was no different. It seems like the week went by too fast.
With the Sponsor Showcase closing on Thursday, attendance at the Friday keynotes was light, but those of us that showed up got to hear some pretty cool presentations.
The first one was from Rachel Rose, who supervises R&D at Industrial Light and Magic. As a fanboy of ILM I was very eager to hear what she had to say, and she didn’t disappoint. (sorry about the unflattering picture but I took three and they were all bad)
In the past, special effects that combine computer generated imagery (CGI) and live action were created separately: the live action actors perform in front of a green screen and the CGI backgrounds are added later. Technology has advanced to the point that the cutting edge now involves live action sets surrounded by enormous, curved LED screens, with the backgrounds projected as the actors perform.
This presents a number of challenges as the backgrounds may need to change as the camera moves, but it provides a much better experience for the actors and the audience.
The tie-in to open source is that a lot of the libraries used in the creation of these effects are now open. In fact, the Academy of Motion Picture Arts and Sciences (the people responsible for the Oscars), along with the Linux Foundation, have sponsored the Academy Software Foundation (ASWF) to act as a steward for the “content creation industry’s open source software base”. The projects under the ASWF fall into one of two tiers: Adopted and Incubation. Currently there are four projects that are mature enough to be adopted and several more in the incubation stage.
A lot of this was so specific to the industry that it went over my head, but I could understand the OpenEXR project, which provides a reference implementation of the EXR file format for storing high quality images.
She then went on to talk about Stagecraft, which is the name of the ILM platform for producing content. I would love to be able to visit one day. It would be so cool to see a feature being made with the CGI, sets and actors all integrated.
The next speaker was Vini Jaiswal, Developer Advocate for Databricks. I had seen a cool Databricks presentation back on Day 2 and the first part was similar, but Jaiswal skipped the in-depth technical details and focused more on features and adoption. A rather large number of companies are using the Delta Lake technology as a way to apply business intelligence to data lakes, and as the need to analyze normally unstructured data becomes more important, I expect to see even more organizations adopt it.
The third presentation was a video by Dmitry Vinnik of Meta on measuring open source project health.
Begin rant.
To be honest I was a little unhappy to see a video as a keynote. It was the only one for the entire week and I have to admit I kind of tuned it out. It wasn’t even novel, as he has given it at least twice before. The video we were shown is available on YouTube from a conference earlier in the month, and he posted another one dated June 24th from the Python Web Conference (while it has a different splash screen it looks to be the same presentation).
Look, I’ve given the same talk multiple times at different conferences, so I get it. But to me keynotes are special and should be unique. I was insulted that I bothered to show up in person, wear a mask, get my temperature checked each day, and I expected something better than a video I could have watched at home.
Note: Rachel Rose played a video as part of her presentation and that’s totally cool, as she didn’t “phone in” the rest of it.
Okay, end rant.
The next two presenters were very inspiring young people, and it was nice to have them included as part of the program.
The first speaker was Alena Analeigh, an amazing young woman who, among other achievements, has been accepted to medical school at age 13 (note that in trying to find a reference for that I came up blank, except for her twitter bio, so if you have one please let me know and I can update this post).
Med school is just one of her achievements. She also founded The Brown STEM Girls as an organization to get more women of color interested in science, technology, engineering and math. She stated that while men make up 52% of the workforce, they represent 76% of people employed in STEM fields.
My love of such things was fostered at an early age, and programs like hers are a great step to encourage young women of color to get interested in and eventually pursue careers in STEM.
While she seemed a little nervous and tentative while presenting, the final speaker of the morning was the exact opposite. Orion Jean is only 11 years old, but I could listen to him speak for hours.
Orion also has a number of accolades, including Time Magazine’s “Kid of the Year“. He got his start as the winner of a speech contest sponsored by Think Kindness, and since then has started the Race to Kindness (“a race where everybody wins”) to spread kindness around the world.
To help inspire acts of kindness he uses the acronym K.I.N.D.:
Keep Your Eyes Open: Look for opportunities to be kind to others. One example he used is one I actually practice: if you are in line to check out at the store and you see a person with far fewer items than you, why not offer to let them check out first?
Include Others: No one can effect change alone. Get others involved.
Nothing Is Too Small: One thing that keeps us from spreading kindness is that we can try to think too big. Even small acts of kindness can have a huge impact.
Do Something About It: Take action. Nothing can change if we do nothing.
After the keynotes I had to focus on some work stuff that I had let languish for the week, so I didn’t make it to any of the presentations, but overall I was happy with my first conference in three years.
There were a few people that attended who tested positive for COVID, so I plan to take some precautions when I get home and hope that the steps the Linux Foundation took to mitigate infection worked. So far I’ve tested negative twice, and I’ll probably take another test on Monday.
My next conference will be SCaLE in Los Angeles at the end of July, and I plan to be in Dublin, Ireland for Open Source Summit – Europe. If you are comfortable getting out and about I hope to see you there.
Facebook’s algorithms seem to have pegged me as a conservative, which I find amusing but also useful. I get to view ads that have absolutely no relevance to me yet prove to be an insightful look at what kind of red meat right-wing organizations are feeding their gullible followers. Yesterday I saw a provocative Facebook ad that was made to rile up the fearful. The group “Color Us United” is holding a talk next week entitled “How Wake County Is Turning Into Woke County.”
This got me looking into the organization, Color Us United. Color Us United appears to be a Morrisville-based non-profit run by Kenneth Xu, a 24-year-old who it seems makes a living stirring up racial animosity under the guise of condemning it. He’s been profiled in a few North Carolina right-wing blogs as well as on Hill.TV, and his narrative seems to be that racists should not be called out on their racism.
I was interested in knowing more about this organization, so I figured I’d start by looking at its corporate filings. Color Us United calls itself a 501(c)(3), so it should have some paperwork somewhere. Let’s do a search on the North Carolina Secretary of State’s website:
Interesting, huh? The Secretary of State does explain that not all charities are required to obtain a charitable solicitation license. Notably, those charities that bring in less than $25,000 a year and don’t compensate anyone are exempt from the license:
Color Us United does solicit donations on its website, where it tells potential donors that its EIN is 85-0513810.
EINs are a handy way to uncover more documents about an organization. A Google Search on 85-0513810 gives us:
So, Color Us United does not appear to be the organization that was registered with the 85-0513810 EIN. Instead, the EIN appears to belong to a nonprofit called “Center for Race and Opportunity,” which seems to have two addresses associated with it: one at PO Box 314, Kingston, NJ; and one at 3781 Westerre Pkwy Suite F, Henrico, VA. The Henrico address appears to belong to a coworking space, according to a check of Google StreetView.
It’s possible that Color Us United has a Doing Business As (DBA) legal arrangement but I have not found anything yet to indicate this.
It appears from entries on the opencorporates and the Virginia-Company website that the Center for Race and Opportunity is in danger of being inactivated, possibly due to some late filings. The only Form 990-N entry shown on the IRS website is one filed for the 2020 tax year.
So what does this all mean? It seems at first glance that the Center for Race and Opportunity and the Color Us United organizations are small potatoes (if they are, indeed, individual organizations – it is hard to tell). It would be tempting to dismiss this as the hobby of a privileged but misguided kid but one thing really stands out here: the Color Us United group has spent over $100,000 on political Facebook ads over the last four years.
A hundred grand ain’t small potatoes, especially to your typical 24-year-old kid. So where is this money coming from? It’s really hard to say as there are few resources available online to track it. I have been unable to find Color Us United listed in any of the databases of the North Carolina Secretary of State. The organization tied to the EIN that Color Us United is using to claim its non-profit status has apparently not filed a recent Form 990N and is listed in databases as pending inactivation. And while a charity that spends $100k in four years on Facebook ads could conceivably be taking in less than $25,000 per year, in my mind this is very close to crossing a line – if not completely obliterating it. I am skeptical that Color Us United/Center for Race and Opportunity is taking in less than $25,000 per year and is not compensating anyone, especially considering that being president of the organization is the only job Mr. Xu lists on his LinkedIn profile at the moment.
Who is paying the bills at Color Us United? Color me unsure. I think it deserves more scrutiny.
I’ve enjoyed these keynotes so far, but to be honest nothing has made me go “wow!” as much as this presentation by Fermyon. I felt like I was witnessing a paradigm shift in the way we provide services over the network.
To digress quite a bit, I’ve never been happy with the term “cloud”. An anecdotal story is that the cloud got its name from the fact that the Visio icon for the Internet was a cloud (it’s not true) but I’ve always preferred the term “utility computing”. To me cloud services should be similar to other utilities such as electricity and water where you are billed based on how much you use.
Up until this point, however, instead of buying just electricity it has been more like you are borrowing someone else’s generator. You still have to pay for infrastructure.
Enter “serverless“. While there are many definitions of serverless, the idea is that when you are not using a resource your cost should be zero. I like this definition because, of course, there have to be servers somewhere, but under the utility model you shouldn’t be paying for them if you aren’t using them. This is even better than normal utilities because, for example, my electricity bill includes fees for things such as the meter and even if I don’t use a single watt I still have to pay for something.
Getting back to the topic at hand, the main challenge with serverless is how do you spin up a resource fast enough to be responsive to a request without having to expend resources when it is quiescent? Containers can take seconds to initialize and VMs much longer.
Fermyon hopes to address this by applying WebAssembly to microservices. WebAssembly (Wasm) was created to allow high performance applications, written in languages other than JavaScript, to be served via web pages, although as Fermyon went on to demonstrate this is not its only use.
The presentation used a game called Finicky Whiskers to demonstrate the potential. Slats the cat is a very finicky eater. Sometimes she wants beef, sometimes chicken, sometimes fish and sometimes vegetables. When the game starts, Slats will show you an icon representing the food she wants, and you have to tap or click on the right icon in order to feed her. After a short time, Slats will change her choice and you have to switch icons. You have 30 seconds to feed her as many correct treats as possible.
Okay, so I doubt it will have the same impact on game culture as Doom, but they were able to implement it using only seven microservices, all in Wasm. There is a detailed description on their blog, but I liked the fact that it was language agnostic. For example, the microservice that controls the session was written in Ruby, but the one that keeps track of the tally was written in Rust. The cool part is that these services can be spun up on the order of a millisecond or less, and the whole demo runs on three t2.small AWS instances.
This is the first implementation I’ve seen that really delivers on the promise of serverless, and I’m excited to see where it will go. But don’t let me put words into their mouth, as they have a blog post on Fermyon and serverless that explains it better than I could.
Note: Full disclosure, I am an AWS employee and this post is a personal account that has not been endorsed or reviewed by my employer.
OpenSearch is an open source (Apache 2.0 licensed) set of technologies for storing large amounts of text that can then be searched and visualized in near real time. Its main use case is making sense of streaming data that you might get from, say, log files or other types of telemetry. It uses the Apache Lucene search engine, and the latest version is based on Lucene 9.1.
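To give a feel for the basic loop, here is a minimal sketch using the opensearch-py client to index a log line and search it back; the host, credentials, and index name are placeholders for a local test cluster.

```python
# A minimal sketch of the core OpenSearch loop: index a log line, then search
# for it. Host, credentials, and index name are placeholders for a local test
# cluster with default settings.
from opensearchpy import OpenSearch

client = OpenSearch(
    hosts=[{"host": "localhost", "port": 9200}],
    http_auth=("admin", "admin"),
    use_ssl=True,
    verify_certs=False,
)

# Store a document; OpenSearch indexes the text fields for full-text search.
client.index(
    index="app-logs",
    body={"@timestamp": "2022-06-23T14:00:00Z", "level": "ERROR", "message": "connection refused to payments service"},
    refresh=True,
)

# Search it back in near real time.
result = client.search(index="app-logs", body={"query": {"match": {"message": "connection refused"}}})
for hit in result["hits"]["hits"]:
    print(hit["_source"]["level"], hit["_source"]["message"])
```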
One of the best ways to encourage adoption of an open source solution is to have it integrate with other applications. With OpenSearch this has traditionally been done using plugins, but there is an initiative underway to create an “extension” framework.
Plugins have a number of shortcomings, especially in that they tend to be tightly coupled to a particular version of OpenSearch, so if a new version comes out your existing plugins may not be compatible until they, too, are upgraded. I run into this with a number of applications I use such as Grafana and it can be annoying.
The idea behind extensions is to provide an SDK and API that are much more resistant to changes in OpenSearch so that important integrations are decoupled from the main OpenSearch application. This also provides an extra layer of security as these extensions will be more isolated from the main code.
I found this encouraging. It takes time to build a community around an open source project but one of the best ways to do it is to provide easy methods to get involved and extensions are a step in the right direction. In addition, OpenSearch has decided not to require a Contributor License Agreement (CLA) for contributions. While I have strong opinions on CLAs this should make contributing more welcome for developers who don’t like them.
The next speaker was Taylor Dolezal from the Cloud Native Computing Foundation (CNCF). I liked him from the start, mainly because he posted a picture of his dog:
and it looks a lot like one of my dogs:
Outside of having a cool dog, Dolezal has a cool job and talked about building community within the CNCF. Just saying “hey, here’s some open source code” doesn’t mean that qualified people will give up nights and weekends to work on your project, and his experiences can be applied to other projects as well.
The final keynote was from Chris Wright of Red Hat and talked about open source in automobiles.
A while ago I actually applied for a job with Red Hat to build a community around their automotive vertical (I didn’t get it). I really like cars and I thought that combining that with open source would be a dream job (plus I wanted the access). We are on the cusp of a sea change with automobiles as the internal combustion engine gives way to electric motors. Almost all manufacturers have announced the end of production for ICEs, and electric cars are much more focused on software. Wright showed a quote predicting that automobile companies will need four times the amount of software-focused talent that they need now.
I think this is going to be a challenge, as the automobile industry is locked into 100+ years of “this is the way we’ve always done it”. For example, in many states it is still illegal to sell cars outside of a dealership. When it comes to technology, these companies have recently been focused on locking their customers into high-margin proprietary features (think navigation) and only recently have they realized that they need to be more open, such as supporting Android Auto or CarPlay. As open source has disrupted most other areas of technology, I expect it to do the same for the automobile industry. It is just going to take some time.
I actually found some time to explore a bit of Austin outside the conference venue. Well, to be honest, I went looking for a place to grab lunch and all the restaurants near the hotel were packed, so I decided to walk further out.
The Brazos River flows through Austin, and so I decided to take a walk on the paths beside it. The river plays a role in the latest Neal Stephenson novel called Termination Shock. I really enjoyed reading it and, spoiler alert, it does actually have an ending (fans of Stephenson’s work will know what I’m talking about).
I walked under the Congress Avenue bridge, which I learned was home to the largest urban bat colony in the world. I heard mention at the conference of “going to watch the bats” and now I had context.
Back at the Sponsor Showcase I made my way over to the Fermyon booth where I spent a lot of time talking with Mikkel Mørk Hegnhøj. When I asked if they had any referenceable customers he laughed, as they have only been around for a very short amount of time. He did tell me that in addition to the cat game they had a project called Bartholomew that is a CMS built on Fermyon and Wasm, and that was what they were using for their own website.
If you think about it, it makes sense, as a web server is, at its heart, a fileserver, and those already run well as a microservice.
They had a couple of devices up so that people could play Finicky Whiskers, and if you got a score of 100 or more you could get a T-shirt. I am trying to simplify my life which includes minimizing the amount of stuff I have, but their T-shirts were so cool I just had to take one when Mikkel offered.
Note that when I got back to my room and actually played the game, I came up short.
The Showcase closed around 4pm and a lot of the sponsors were eager to head out, but air travel disruptions affected a lot of them. I’m staying around until Saturday and so far so good on my flights. I’m happy to be traveling again but I can’t say I’m enjoying this travel anxiety.
[Note: I overcame my habit of sitting toward the back and off to the side, so the quality of the speaker pictures has improved greatly.]
When I first heard the term my thought was that someone had spoken a particular profanity at an inappropriate time, but SBOM in this context means “Software Bill of Materials”. Open source is so prevalent these days that it is probably included in a lot of the software you use and you may not be aware of it, so when an issue is discovered such as Log4shell it can be hard to determine what software is affected. The idea of asking all vendors (both software-only and software running on devices) to provide an SBOM is a first step to being able to audit this software.
It isn’t as easy as you might think. The OpenNMS project I was involved with used over a hundred different open source libraries. I know because I once did a license audit to make sure everything being used had compatible licenses. I also have used Black Duck Software (now Synopsys) to generate a list of included software, and it looks like they now offer SBOM support as well, but I get ahead of myself.
Note that Synopsys is here in the Sponsor Showcase but when I stopped by the booth no one was there.
Getting back to the conference, the keynotes on the second morning were more sparsely attended than the day before, but the room was far from empty. The opening remarks were given by Mike Dolan, SVP and GM of Projects at the Linux Foundation, who was a last minute replacement for Jim Zemlin, who was not feeling well.
Included in the usual housekeeping announcements was a short “in memoriam” for Shubhra Kar, the Linux Foundation CTO who passed away unexpectedly this year.
Dolan also mentioned that the Software Package Data eXchange (SPDX) open standard used for creating SBOMs had turned 10 years old (and it looks like it will hit 11 in August). This was relevant because with applications of any complexity including hundreds if not thousands of open source software projects, there had to be some formal way of listing them for analysis in an SBOM, and most default to SPDX.
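For anyone who has never looked at one, here is a loose sketch of the shape of a tiny SPDX-style SBOM as JSON. The field names follow my reading of the SPDX 2.x spec and real documents carry far more detail, so treat this as illustrative rather than a validated example.

```python
# A loose sketch of a tiny SPDX-style SBOM as JSON. Field names are my
# approximation of the SPDX 2.x JSON format; real documents are much richer
# and should be produced by proper tooling, not by hand.
import json

sbom = {
    "spdxVersion": "SPDX-2.3",
    "dataLicense": "CC0-1.0",
    "SPDXID": "SPDXRef-DOCUMENT",
    "name": "example-app-1.0.0",
    "packages": [
        {
            "SPDXID": "SPDXRef-Package-log4j-core",
            "name": "log4j-core",
            "versionInfo": "2.17.2",
            "downloadLocation": "NOASSERTION",
            "licenseConcluded": "Apache-2.0",
        },
    ],
}

print(json.dumps(sbom, indent=2))
# With a document like this per release, answering "are we shipping an affected
# log4j?" becomes a search instead of an archaeology project.
```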
The next keynote speaker, Carter, spoke on the work the Linux Foundation is doing to measure the worldwide impact of open source. As part of that she mentioned that there is a huge demand for open source talent in the marketplace, but there are also policy barriers that keep employees of many companies from contributing to open source. She also brought up SBOMs as a way to determine how widespread open source use is in modern applications.
Since diversity has been a theme at this conference I wanted to address a pet peeve of mine. This is a slide from Carter’s presentation and it uses a stylized Mercator projection to show the world. I just think it is about time we stop using this projection, as the continent highlighted, Africa, is actually much, much larger in proportion to the other continents than is shown on this map. As an alternative I would suggest the Gall-Peters projection.
To further digress, I asked my friend Ben to run “stylized Gall-Peters projection” through Midjourney but I didn’t feel comfortable posting any of the results (grin).
The goal of Unified Patents is to protect open source from patent trolls. Patent trolls are usually “non-practicing entities” who own a lot of patents but exist to extract revenue from companies they believe are infringing upon them versus building products. Quite frequently it is cheaper to settle than pursue legal action against these entities and this just encourages more actions on the part of the trolls.
The strategy to combat this is described as “Detect, Disrupt and Deter”. For a troll, the most desired patents are ones that are broad, as this means more companies can be pursued. However, overly broad patents are also subject to review, and if the Patent and Trademark Office is convinced a patent isn’t specific enough it can invalidate it, destroying the revenue stream for the patent troll.
I’m on the fence over software patents in general. I mean, if a company could create a piece of software that exactly modeled the human body and how a particular drug would interact with it, I think that deserves some protection. But I don’t think that anyone owns the idea of, say, “swipe left to unlock”. Also it seems like software rights could be protected by copyright, but then again IANAL (one source for more information on this is Patent Absurdity).
The next person on stage was Amir Montazery of the Open Source Technology Improvement Fund. The mission of the OSTIF is to help secure open source software. They do this through both audits and fundraising, providing the resources open source projects need to make their software as secure as possible.
Jennings Aske of New York-Presbyterian Hospital spoke next. I have worked a bit with technology in healthcare, and as he pointed out there are a lot of network-connected devices used in medicine today, from the devices that dispense drugs to the hospital beds themselves. Many of those do not have robust security (and note that these are proprietary devices). Since a hack or other breach could literally be a life and death situation, steps are being taken to mitigate this.
I enjoyed this talk mainly because it was from the point of view of a consumer of software. As customers are what drive software revenues, they stand the best chance in getting vendors to provide SBOMs, along with government entities such as the National Telecommunications and Information Administration (NTIA). The NTIA has launched an effort called Software Component Transparency to help with this, and Jennings introduced a project his organization sponsors called DaggerBoard that is designed to scan SBOMs to look for vulnerabilities.
The next keynote was from Arun Gupta of Intel. His talk focused on building stronger communities and how Intel was working to build healthy, open ecosystems. He pointed out that open source is based largely on trust, which is an idea I’ve promoted since I got involved in FOSS. Trust is something that can’t be bought and must be earned, and it is cool to see large companies like Intel working toward it.
The final presenter was Melissa Smolensky from Gitlab who based her presentation around a “love letter to open source”. It was cute. I too have a strong emotional connection to my involvement in free and open source software that I don’t get anywhere else in my professional life, at least to the same degree.
I did get to spend some time near the AWS booth today, and after chatting at length with the FreeRTOS folks I happened to be nearby when Chris Short did a presentation on GitOps.
In much the same way that Apple inspired a whole generation of Internet-focused products to put an “i” in front of their name, DevOps has spawned all kinds of “Ops” such as AIOps and MLOps and now GitOps. The idea of DevOps was built around creating processes to more closely tie software development to software operation and deployment, and key to this was configuration management software such as Puppet and Ansible. Instead of having to manage configuration files per instance, one could store them centrally and use agents to deploy them into the environment. This central repository allows for a high degree of control and versioning.
It is hard to think of a better tool for versioning than git, and thus GitOps was born. With GitOps, the desired state of the application and its environment is described in configuration files (usually YAML) stored in a git repository, and changes are made by changing those files through git.
While I am not an expert on GitOps by any means, suppose your application used a configuration file to determine the various clusters to create. To generate a new cluster you would just edit the file in your local copy of the repo, git commit and git push.
Your application would then use something like Flux (not to be confused with the Flux query language from InfluxData) to notice that a change had occurred, do a git pull, and apply it.
Pretty cool, huh? A lot of people are familiar with git, so it makes the DevOps learning curve a lot less steep. It also allows for the use of multiple repositories, so you can control, say, access to secrets differently from the main application configuration.
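To make that concrete, here is a rough sketch of what the day-to-day loop can look like from the operator's side. The repository URL and the clusters.yaml file are made-up names for illustration, and the pull-and-apply half is handled by whatever agent (Flux or something similar) is watching the repo:

# Get a local copy of the configuration repository (hypothetical URL).
git clone git@example.com:myorg/infra-config.git
cd infra-config

# Describe the change you want by editing the configuration, not the running environment.
$EDITOR clusters.yaml    # e.g. add an entry for a new cluster

# Record and publish the change; for a human, this is the entire "deployment" step.
git add clusters.yaml
git commit -m "Add staging cluster"
git push origin main

# From here an agent such as Flux notices the new commit, pulls it,
# and applies the difference to the environment.

A nice side effect of keeping everything in git is that rolling back is just as simple: git revert the offending commit, push, and the agent walks the environment back for you.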
Also while I was in the booth I got this picture of two Titans of Open Source, Spot Callaway and Brian Proffitt. Oh yeah.
Now as someone who has given a lot of talks, I try to be respectful of the presenter and, with the exception of the occasional picture and taking notes, I try to stay off my phone. I apologized to the presenter afterward, as I was spending a lot of time looking up terms with which I was unfamiliar, such as "ACID" and "parquet".
Delta Lake is an open source project to create a "Lakehouse". The term is derived from a combination of "Data Warehouse" and "Data Lake".
Data warehouses have been around for a very long time (in one of my first jobs I worked for a VAR that built hardware solutions for storing large data warehouses), and the idea was to bring together large amounts of operational data into one place so that "business intelligence" (BI) could be applied to help make decisions about the organization. Typically this data has been very structured, such as numeric or text data.
But people started figuring out that a lot of data, such as images, needed to be stored in more of a raw format, and these collections of raw data became known as "data lakes". Raw data like that didn't lend itself well to the usual BI analysis techniques.
Enter Delta Lake. Based on Apache Spark, it attempts to make data lakes more manageable and to make them as useful as data warehouses. I’m eager to find the time to learn more about this. When I was at OpenNMS we did a proof of concept about using Apache Spark to perform anomaly detection and it worked really well, so I think it is perfectly matched to make data lakes more useful.
My day ended at an internal event sponsored by Nithya Ruff, who, in addition to being the chairperson of the Linux Foundation, is also the head of the AWS OSPO. I made a number of new friends (and also got to meet Amir Montazery from the morning keynotes in person) but ended up calling it an early night because I was just beat and eager to be fresh for the next day of the conference.
The main activities for the Open Source Summit kicked off on Tuesday with several keynote sessions. The common theme was community and security, including the Open Source Security Foundation (OpenSSF).
The focus on security doesn’t surprise me. I was reminded of this xkcd comic when the Log4shell exploit hit.
At the time I was consulting for a bank and I called the SVP and said “hey, we really need to get ahead of this” and he was like “oh, yeah, I was invited to a security video call a short while ago” and I was like “take the call”.
I managed to squeeze into the ballroom just before the talks started, and I was happy to see the room was packed, and would end up with a number of people standing in the back and around the edges.
The conference was opened by Robin Bender Ginn, Executive Director of the OpenJS Foundation.
After going over the schedule and other housekeeping topics, she mentioned that in recognition of Pride Month the conference was matching donations to the Transgender Education Network of Texas (TENT) as well as Equality Texas, up to $10,000.
In that vein the first person to speak was Aeva Black, and they talked about how diversity can increase productivity in communities, specifically open source communities, by bringing in different viewpoints and experiences. It was very well received, with many people giving a standing ovation at its conclusion.
The next speaker was Eric Brewer from Google (a platinum sponsor) and his talk focused on how to improve the robustness and security of open source (and he joked about having to follow Black with such a change of topic). Free software is exactly that, free and “as is”. So when something like Log4shell happens that impacts a huge amount of infrastructure, there is really no one who has an implicit obligation to rectify the issue. That doesn’t prevent people from trying to force someone to fix things, as this infamous letter to Daniel Stenberg demonstrates.
Brewer suggests that we work on creating open source “curators” who can provide commercial support for open source projects. In some cases they could be the maintainer, but it is not necessary. When I was at OpenNMS our support offerings provided some of this indemnification along with service levels for fixing issues, but of course that came at a cost. I think it is going to take some time for people to realize that free software does not mean a free solution, but this idea of curators is a good start.
I got the feeling that the next presentation was one reason the hall was so packed as Linus Torvalds and Dirk Hohndel took the stage. Linus will be the first to admit that he doesn’t like public speaking, but I found that this format, where Dirk asked him questions and he responded, worked well. Linus, who is, well, not known for suffering fools gladly, admitted and apologized for his penchant for being rather sharp in his criticism, and when Dirk asked if he was going to be nicer in the future Linus said, no, he probably wouldn’t so he wanted to proactively apologize. That made me chuckle.
This was followed by a security-focused presentation by Todd Moore from IBM, another platinum sponsor. He also addressed trying to improve open source security but took an angle more aimed at government involvement. Digital infrastructure is infrastructure, much like bridges, roads, clean water, etc., and there should be some way for governments to fund and sponsor open source development.
The final keynote for today was a discussion with Amy Gilliland who is the President of General Dynamics Information Technology (GDIT). In a past life I worked quite a bit with GDIT (and you have to admit, that can be a pretty appropriate acronym at times) and it is nice to see a company that is so associated with more secretive aspects of government contracting focusing on open source solutions.
After the keynotes I visited the Sponsor Hall to see the AWS booth. It was pretty cool. As a diamond sponsor it is right in front as you enter.
There were people from a number of the open source teams at AWS available to do presentations, including FreeRTOS and OpenSearch.
I don’t have booth duty this conference so I decided to wander around. I thought it was laid out well and it was interesting to see the variety of companies with booths. I did take some time to chat with the folks at Mattermost.
While I’m a user of both Discord and Slack, I really, really like Mattermost. It is open source and provides a lot of the same functionality as Slack, and you can also host it yourself which is what the OpenNMS Project does. If you don’t want to go to the trouble of installing and maintaining your own instance, you can get the cloud version from Mattermost, and I learned that as of version 7 there is a free tier available so there is nothing preventing you from checking it out.
I did take a short break from the conference to grab lunch with my friend William Hurley (whurley). It had been at least three years since we'd seen each other face to face and, thinking back, I was surprised at the number of topics we managed to cover in our short time together. He is an amazing technologist currently working to disrupt, and in many ways found, commercial quantum computing through his company StrangeWorks. He also made me aware of Amazon Braket, which lets those of us who aren't whurley access quantum computing services. I'm eager to check it out as it is an area that really interests me.
Time series data collection and storage was a focus of mine when I was involved in monitoring, and Influx is working to make flexible solutions using open source. Steinkamp’s presentation was on combining data collection at the edge with backend storage and processing in the cloud. Influx had a working example of a device that would monitor the conditions of a plant (she’s an avid gardener) such as temperature and moisture, and this data was collected locally and then forwarded to the cloud. They have a new technology called Edge Data Replication designed to make the whole process much more robust.
I was excited to learn about their query language. Many time series solutions focus so much on obtaining and storing the data and not enough on making that data useful, which to me seems to be the whole point. I’m eager to play with it as soon as I can.
One thing that bothered me was that the hotel decided to have the windows washed in the middle of the presentation.
Steinkamp did a great job of soldiering through the noise and not letting it faze her.
The evening event was held at Stubb's, a restaurant that is also a music venue.
I've been a fan of Stubb's barbecue sauce for years, so it was cool to go to the restaurant that bears his name, even though the Austin location was opened in 1996, a year after Christopher B. Stubblefield died.
It was a nice end to a busy day, and I look forward to Day 2.
Monday was a travel day, but it was notable as it was the first time I have been in an airport since August. I fly out of RDU, and the biggest change was that they now have the “Star Trek” x-ray machines to scan carry-on luggage. While I was panicked for a second when I downloaded my boarding pass and didn’t see the TSA Precheck logo, I was able to get that sorted out so going through security was pretty easy.
The restrictions on masks for air travel have been lifted, but I wore mine along with about 10% of the other travelers. Even though I’ve had four shots and a breakthrough case of COVID I do interact with a lot of older people and since I’ll be around the most people in years at the Open Source Summit I figured I’d wear mine throughout the trip.
And while they aren't N95s, being a car nut I tried out these masks from K&N Engineering, a company known for high-end air filtration for performance vehicles, and you almost don't realize you are wearing one.
Anyway, I made my way to the Admiral’s Club and was pleasantly surprised to see it wasn’t very crowded. It was nice to have the membership (it comes with my credit card) as my flight to Charlotte was delayed over 90 minutes. I wasn’t too worried since I had a long layover before heading to Austin, so I was a lot less stressed than many of my fellow travelers.
The flight to Austin left on time and landed early, but we got hit with the curse in that our gate wasn’t available, so we ended up on the tarmac for 45 minutes, getting in 30 minutes late.
Not that I’m complaining. Seriously, according to my handy the trip from my home to Austin by car is 19 hours. From the moment I left my home until we landed was more like 8 hours, and most of that was enjoyable. I always have to remind myself of this wonderful clip by Louis CK which kind of sums up the amazing world in which we live where every time we fly we should be saying to ourselves “I’m in a chair in the sky!”
I checked in at the hotel, and then we headed back out in our rented minivan to get the last member of our team and drove about 45 minutes outside of Austin to this barbecue joint called Salt Lick in Driftwood, Texas. It was wonderful and I was told we owed this experience to a recommendation years ago from Mark Hinkle, so thanks Mark!
You can’t really tell a good barbecue restaurant by its looks, although shabbier tends to be better, but more by the smell. When you get out of your vehicle your nose is so assaulted with the most wonderful smell you might be drawn to the entrance so quickly that you miss the TARDIS.
We sat at a big picnic table and ordered family style, which was all you could eat meat, slaw, baked beans, bread, pickles and potato salad. I was in such a food coma by the end that I forgot to take a picture of the cobbler.
I tried not to fall asleep on the ride back to Austin (I wasn’t driving) but it was a great start to what I hope is a wonderful week.
Next week I’ll be attending my first conference in nearly three years. My last one turned out to be the very last OSCON back in 2019. Soon after that I was in a bad car accident that laid me up for many months and then COVID happened.
I am both eager and anxious. Even having four vaccine shots and one breakthrough case I still feel a little exposed around large groups of people, but the precautions outlined in the “Health and Safety” section of the conference website are pretty robust and I am eager to see folks face-to-face (or mask-to-mask) once again.
The Linux Foundation’s Open Source Summit used to be known as Linuxcon and now it is an umbrella title for a number of conferences around open source, all of which look cool. My new employer, AWS, is a platinum sponsor and will also have a booth (I am not on booth duty this trip but I’ll be around). I am looking forward to getting to meet in person many of my teammates who I’ve only seen via video, old friends I haven’t seen in years, and to making a bunch of new ones.
Of course, we would have to have a conference in Austin during a heat wave. I was thinking about never leaving the conference venue but then I remembered … barbecue.
If you are going and would like to say “hi” drop me a note on Twitter or LinkedIn or send an e-mail to tarus at tarus dot io.
Recently my friend Jonathan had a birthday, and I sent him a short note with best wishes for the day and to let him know I was thinking about him.
In his reply he included the following paragraph:
[I] was reminded of your comment about a sparsely attended OUCE conference at Southampton one year. You said something along the lines of that it didn’t matter, that you would try to make it the best experience you could for everyone there. That stuck with me. It’s been one of my mantras ever since then.
I can remember talking about that, although I also remember I was very ill during most of that conference and spent a lot of time curled up in my room.
Putting on conferences can be a challenge. You don't know how many people will show up, but you have to plan months in advance in order to secure a venue. Frequently we could use information about the previous conference to approximate the next one, but quite often there were a number of new variables that were hard to measure. In this case moving the conference from Germany, near Frankfurt, to Southampton in the UK resulted in a lot fewer people coming than we expected.
It is easy to get discouraged when this happens. I have given presentations in full rooms where people were standing in the back and around the edges, and I have given presentations to three people in a large, otherwise empty room. In both cases I do my best to be engaging and to meet the expectations of those people who were kind enough to give me their attention.
I think this is important to remember, especially in our open source communities. I don’t think it is easy to predict which particular people will become future leaders on first impressions, so investing a little of your attention in as many people as possible can reap large results. I can remember when I started in open source I’d sometimes get long e-mails from people touting how great they were, which was inevitably followed up with a long list of things I needed to do to make my project successful. Other times I’d get a rather timid e-mail from someone wanting to contribute, along with some well written documentation or a nice little patch or feature, and I valued those much more.
I can remember at another OUCE we ended up staying at a hotel outside of Fulda because another convention (I think involving public service vehicles like fire trucks and ambulances) was in town at the same time. There was a van that would pick us up and take us into town each morning, and on one day a man named Ian joined me for the ride. He was complaining about how his boss made him come to the conference and he was very unhappy about being there. I took that as a challenge and spent some extra time with him, and by the end of the event he had become one of the project’s biggest cheerleaders.
In the book Zen and the Art of Motorcycle Maintenance the author Robert Pirsig demonstrates a correlation between "attention" and "quality". In today's world I often find it hard to focus my attention on any one thing at a time, and it is something I should improve. But I do manage to put a lot of attention into person-to-person interactions, and that has been very valuable over the years.
In any case I was touched that Jonathan remembered that from our conversation, and it helps to be reminded. It also motivated me to write this blog post (grin).
When I announced that I had joined AWS, at least two of my three readers reached out with questions so I thought I’d post an update on my onboarding process and impressions so far.
One change you can expect is that when I talk about my job on this blog, I’m going to add the following disclaimer:
Note: Everything expressed here represents my own thoughts and opinions and I am not speaking for my employer Amazon Web Services.
Back when I owned the company I worked for I had more control about what I could share publicly. While I am very excited to be working for AWS and may, at some time in the future, speak on their behalf, this is not one of those times.
A number of people joked about me joining the “dark side”. My friend Talal even commented on my LinkedIn post with the complete “pitch speech” Darth Vader made to Luke Skywalker in Empire. While I got the joke I’d always had a pretty positive opinion of Amazon, gained mainly through being a retail customer.
I recently went and traced what I think to be my first interaction with Amazon back to a book purchase made in December of 1997. In the nearly 25 years I’ve been shopping there I can think of only two times that I was disappointed with their customer service (both involving returns) and numerous times when my expectations were exceeded by Amazon. For example, I once spent around $70 on two kits used to clean high performance automotive air filters. In shipment one of them leaked, and I asked if I could return it. They told me to keep both and refunded the whole $70, even after I protested that I’d be happy with half that.
It was this focus on customer service that attracted me to the possibility of working with Amazon. When I was at OpenNMS I crafted a mission statement that read “Help Customers. Have Fun. Make Money”. I thought I came up with it on my own but I may have gotten inspiration from a Dilbert cartoon, although I changed the order to put the focus on customers. I always put a high value on customer satisfaction.
I have also been a staunch, and I’ll admit, opinionated, proponent of free and open source software and nearly 20 years of those opinions are available on this blog. Despite that, AWS still wanted to talk to me, and as I went through the interview process I really warmed to the idea of working on open source at AWS.
Just before I started I received a note from the onboarding specialist with links to content related to Amazon’s “peculiar” culture. When I read the e-mail I was pretty certain they meant “particular”, as “particular” implies “specific” and “peculiar” implies “strange”. Nope, peculiar is the word they meant to use and I’m starting to understand why. They are so laser-focused on customer satisfaction that their methods can seem strange to people used to working in other companies.
As you can imagine with a company that has around 1.6 million employees, they have the onboarding process down to a science. My laptop and supporting equipment showed up before my start date, and with few problems I was able to get on the network and access Amazon resources. These last two weeks have been packed with meeting people, attending virtual classes with other new hires, and going through a lot of online training. One concept they introduce early on is the idea of “working backwards”. At Amazon, everything starts from the customer and you work backwards from there. After having this drilled into my head in one of the online courses it was funny to watch a video of Jeff Bezos during an All Hands meeting where someone asks if the “working backwards” process is optional.
Based on my previous experience with large companies I was certain of the answer: no, working backwards is not optional. Period.
But that wasn’t what he said. He said it wasn’t optional unless you can come up with something better. I know it is kind of a subtle distinction but it really resonated with me, as it drove home the fact that at Amazon no process is really written in stone. Everything is open to change if it can be improved. As I learn more about Amazon I’ve found that there are many “tenets”, or core principles, and every one of them is presented in the context that these exist until something better is discovered, and there seem to be a lot of processes in place to suggest those improvements at all levels of the company.
If there is anything that isn't open to change, it is the goal of becoming the world's most customer-centric company. While a lot of companies can claim to be focused on their customers without many specifics, at Amazon this is defined as having low prices, large selection and a great customer experience. Everything else is secondary.
I bring this up because it is key to understanding Amazon as a company. To get back to my area of expertise, open source, quite frequently open source involvement is measured by things such as number of commits, lines of code committed, number of projects sponsored and number of contributors. That is all well and good but seen through the lens of customer satisfaction they mean nothing, so they don’t work at Amazon. Amazon approaches open source as “how can our involvement improve the experience of our customers?”
(Again, please remember that is my personal opinion based on my short tenure at AWS and doesn’t constitute any formal policy or position)
Note that with respect to open source at AWS, “customer” can refer to both end users of software who want an easy and affordable way to leverage open source solutions as well as open source projects and companies themselves. My focus will be on the latter and I’m very eager to begin working with all of these cool organizations creating wonderful open source solutions.
This focus may not greatly increase those metrics mentioned above, but it is hoped that it will greatly increase customer satisfaction.
So, overall, I’m very happy with my decision to come to AWS. I grew up in North Carolina where the State motto is Esse Quam Videri, which is Latin for “to be rather than to seem”. My personal goal is to see AWS considered both a leader and an invaluable partner for open source companies and projects. I realize that won’t happen overnight and I welcome suggestions on how to reach that goal. In any case it looks like it is going to be a lot of fun.
A side effect of my work on singing has been discovering what tools I need to sound decent. I started with a very good USB microphone a few years ago and then graduated to an inexpensive, 8-channel USB mixer board that I could use with some decent XLR mics I had lying around. When I got my current job, I went out and bought a top-of-the-line Shure SM7B microphone and paired it with my mixer, which got me even closer to the professional sound I wanted. Then I found a used digital sound card, an 8-channel Firewire-based M-Audio 2626 and bought it cheap.
Firewire is essentially an abandoned technology now that Apple no longer ships systems with it, but it is still alive and well in Linux. I took one of my old desktop PCs out of storage, added a hard drive, installed Ubuntu Studio on it, and now have a digital audio workstation (DAW), for dirt cheap! Ubuntu Studio comes with a huge number of audio and video production tools and plugins. It works just fine with this very old M-Audio 2626, too.
My audio tool of choice for editing was once Audacity, but Ubuntu Studio comes with the open-source, ProTools-like DAW called Ardour. I've learned how to do some amazing things with manipulating audio using Ardour, simply by diving in and trying different things. I'm sure there is at least 200% more I can be doing with it when I fully understand its capabilities.
Over the past few days and nights, I’ve spent my free time using Ardour to recreate one of my favorite songs, R.E.M.’s These Days. I’ve often looked for old-school karaoke tracks for R.E.M. but there are few that aren’t the hits everyone’s heard a million times already. I did some Google searches to see if anyone’s done this themselves and hit pay dirt when I found a musician named Clive Butler. Clive posted several of his R.E.M. covers to Blogger from 2011-2018 and I thought I’d start with those. Then last week, I discovered he has fresh versions on his very own YouTube channel so I downloaded his version of These Days.
Another fortuitous find was the work of a YouTube user named BaldAndroid. BaldAndroid has remixed many R.E.M. albums, bringing to the fore instruments and voices that were once buried in the release mix. I've been able to suss out parts in These Days that I could never make out before, and have used this guide to recreate these sounds in my own version. Suddenly, my two-track project has ballooned to 7 or more tracks, but such is the nature of professional recording. I was happy this afternoon when I put what may be the finishing touches on the project, proud of how close mine sounds to the original. Then again, I do laugh when I realize what I've managed to do is recreate what was state of the art over FORTY YEARS AGO! I still have a lot to learn, obviously!
The little recording room I’ve built right off my office has become my latest happy place. I can close the door, slip on the headphones, and get lost in the recording process. I can spend hours there, cutting and recutting takes, adding effects, getting the timing and levels right, and all the other stuff that goes into making something sound great. In a way I’ve come full circle now, having started off as a recording engineer at Sing-A-Song Recording Studio at Carowinds back in 1987. I’m having fun seeing what more I can do with this.
As I mentioned previously, I’d taken my singing much more seriously over the last few years, practicing for hours each week to improve my technique. At the end of last year, I got good enough to post a few audio clips and videos on a bandmate-finding website called BandMix. It took about a week before a few bands reached out to me, interested to talk to me about fronting their bands. I said yes to one which was a new Creedence Clearwater Revival tribute band but we never rehearsed because of a surge in COVID at the time. I wound up leaving the band and it kind of broke up soon afterward. Then I got interest from a Beatles tribute band, too, but didn’t think the music was varied or interesting enough. Finally, a musician reached out who was interested in the same music I was – and it was across the gamut of styles. My interest was piqued!
In Beaufort, NC, tagging along on one of Kelly’s work trips at the end of December, I got a call from Chuck, the drummer, who proceeded to talk my ear off on all the stuff the band planned to play. A week later, I showed up at the practice space at Kit’s home and sang a few songs for him. He didn’t say much but his ear-to-ear grin told me all I needed to know. Thus, I became the frontman for DNR.
DNR is composed of veteran musicians, many with a decade or more of experience playing in bands. As for me, this is my very first band. At our early rehearsals, held almost every Saturday morning, I found myself being stared at by my bandmates, waiting for me to take charge and get us playing. It took me a few beats (ha!) to learn how to actually lead a band, but basically I faked it until I figured out what I was doing. I never considered before how cool and powerful it feels to set this band (or any band) in motion. It's not something I pondered when I was singing solo to karaoke tracks!
So we rehearsed and rehearsed, picked an interesting setlist, and missed various practices here and there due to vacations, COVID cases, and what have you. Finally, after months of hard work rehearsing, we held our first gig over the Memorial Day weekend: a surprise birthday party for Chuck's wife, Claudia. There were about two dozen people in attendance, and friendly faces at that, but re-watching the video I took, I appreciate more and more how heartfelt the applause we earned was.
As we were returning from a break I noticed Kit, our guitarist, was staring at me and chuckling.
“What? What’d I do? Did I miss something?” I asked him in a panic.
“You’re a natural!” he laughed, still grinning.
It was a great compliment. I could tell he meant it, too.
Being a frontman is more than just singing. I have to introduce the song, create banter with the audience, play percussion, get the tempo right when starting a song, and often adjust the sound board for the best sound. The guys look to me for leadership, which still cracks me up as I don’t really know what I’m doing. I guess I’m good at playing it off, though, or maybe just not caring anymore whether I screw up or not.
It is a very easy band to be in as everyone is sharp and witty. We have some real musicians, and take on music from Steely Dan, The Allman Brothers, The Moody Blues, The Doobie Brothers, Dr. John, and many others. While we were setting up for the private party gig, I played some tunes I thought we might want to add to our repertoire and, to my delight, I heard John (our lead guitarist) and Lance (our keyboardist) start to learn to play them. Such talent!
The other interesting thing about the band is that many of the members are nearly twenty years older than me! David, our bassist, is the closest in age at three years older (he is 56 at this writing). Some have been playing for 40 years or more. Many of the songs we play were new when they were playing.
In addition to frontman and recording engineer duties, I took on the role of social media person, creating the band's Facebook page, website, and other social media accounts. I've used my experience in video streaming to record (and hopefully stream) our performances. At nearly every practice, I've recorded the audio (and usually video) for critiquing later. I have gigabytes of media now, so much that I'm starting to wonder where I'm going to put it all!
The only downside is that we don’t have regular gigs lined up yet. I am working to motivate the band to get this done so that we are not just playing for our own benefit but entertaining others. It was such a blast performing for people. It definitely brings a new energy to it all. I hope we can get some spots on local stages (preferably outside) so we can continue to generate buzz and grow our following.
I am loving the work I’m finally putting into music. It’s never too late to follow your dreams, huh?
For obvious reasons I’ve been creating some new passwords lately, and I wanted to share my method for creating strong passwords that are easy to remember yet hard to guess.
The classic xkcd advice of stringing together random words ("correct horse battery staple") does make a lot of sense, but that method has its critics. Attackers can and do use random word generators, which can break such passwords more quickly, even with, say, substituting "3" for "e", etc.
There is also a good argument to be made that we should all be using password managers that generate long random passwords and not really creating passwords at all.
Then there is the very good idea of using two factor authentication, but that tends to augment passwords more than replace them.
In normal life you have to have at least a few passwords memorized, such as the one to get into your device and one to get into your password manager, so I thought I’d share my technique.
I like music, and I tend to listen to pretty obscure artists. What I do is think of a random lyric from a song I like and then convert it into a password.
For example, right now I’m listening to the album Wet Tennis by Sofi Tukker. The track that gives me the biggest earworm is “Original Sin” which opens with the lyric:
So I think you’ve got
Something wrong with you
Something’s not right with me, too
It’s not right with me
If I were going to turn that into a password, I would come up with something like:
sItUgswwysnrwm,2inrwm
Looks pretty random, and contains lower case and upper case letters, a number and a special character. At 21 characters it isn’t quite as long as “correcthorsebatterystaple” but you can always add more words from the lyrics if needed.
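If you want to play with the idea, here is a rough shell sketch that just grabs the first character of each word in a lyric. It won't reproduce the manual tweaks above (keeping the comma, turning "too" into "2"), so treat it as a starting point rather than the finished password:

# Take the first character of each word; tweak capitalization, numbers and punctuation by hand afterwards.
lyric="So I think you've got Something wrong with you Something's not right with me, too It's not right with me"
echo "$lyric" | awk '{ for (i = 1; i <= NF; i++) printf "%s", substr($i, 1, 1); print "" }'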
Just thought I’d throw this out there as it works for me. The only thing I have to remember is not to hum the song while logging in.
Last year I wrote about parting ways with the OpenNMS Project and how I was ready for “Act III” of my professional career.
With my future being somewhat of a tabula rasa, I was a bit overwhelmed with choices, so I decided to return to my roots and dust off my consulting LLC. Soon I found myself in the financial sector helping to deploy network monitoring and observability solutions.
I was working with some pretty impressive applications and it was interesting to see the state of the art for monitoring. We’ve come a long way since SNMP. It was engaging and fun work, but all the software was proprietary and I missed the open source aspect.
Recently, Spot Callaway made me aware of an opportunity at Amazon Web Services for an open source evangelist position. Of all the things I've done in my career, acting as an evangelist for open source solutions was my favorite, and here was a chance to do it full time. I will admit that Amazon was not the first name that popped into my head when I think "open source", but the more I learned about the team and AWS's open source initiatives, the more interested I became in the position. After I made it through their rather intense interview process and met even more people with whom I'll be working, it became a job I couldn't refuse.
So I'm happy to announce that I'm now a Principal Evangelist at AWS, reporting to David Nalley, who, in addition to being a pretty awesome boss, is also the current President of the Apache Software Foundation. OpenNMS would not have existed without software from the ASF, and it will be cool to learn more about that organization first hand.
My main role will be to work with open source companies as an advocate for them within AWS. The solutions AWS provides can help jumpstart these companies toward profitability by providing the resources they need to be successful and to affordably grow as their needs change. While I am just getting started within the organization and it will take me some time to learn the ropes, I am hoping my own experience in running an open source business will provide a unique insight into issues faced by those companies.
Exciting times, so watch this space as my open source adventures continue.
I'm not sure how I feel about my Wikipedia page being deleted. When I was first made aware of the page's existence oh so many years ago I was both flattered and a little embarrassed, mainly because I didn't think I rated a page on Wikipedia. But then I got to thinking that, hey, pretty much anyone should be able to have a page on Wikipedia as long as it adheres to their format guidelines. It's not like it takes up much space, and as long as the person is verifiable as being a real person, why not?
I am certain I would have been okay with my page being deleted soon after it was created, but once you get used to having something, earned or not, there is a strong psychological reaction to having it taken away. From what I can tell the page was created in 2010, so it had been around for nearly 12 years with no one complaining.
The most hurtful thing was a comment about the deletion from EdwardX from London:
Nothing cited in the article counts towards WP:GNG, and I can find nothing better online. Run-of-the-mill person.
Really? Was the “Run-of-the-mill person” comment really necessary? (grin)
I’m still happy about what I was able to accomplish with OpenNMS and building the community around it, even if it was run-of-the-mill, and I plan to promote open source and open source companies for the remainder of my career, even if that isn’t Wikipedia-worthy.
You may have heard of the recently-discovered/-published TLStorm vulnerability that affects – at least – APC SmartUPS devices.
One of the prime issues highlighted is the embedded nanoSSL library that APC has used on these systems.
If you want to find out if your systems are affected, the following nmap snippet should start you toward a solution:
for octet in {30..39}; do (nmap -A -T4 192.168.0.$octet > nmap-192.168.0.$octet.out &) ; done
This will kick off the nmap scans in parallel (the example covers 192.168.0.30 through .39, so adjust the range for your network). When they all finish (you can monitor how many are still running using ps aux | grep nmap), you can then process the files rapidly thusly:
grep -i nano nmap*.out
If nanoSSL has been found, you’ll get a listing of all IPs running it (since you cleverly named your files with the IP in the name).
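If you'd rather not keep checking ps, a small variation (same example 192.168.0.30-39 range) backgrounds the scans directly in the current shell so that wait blocks until every scan has finished before the grep runs:

# Run the scans in parallel and wait for all of them to complete.
for octet in {30..39}; do
    nmap -A -T4 192.168.0.$octet > nmap-192.168.0.$octet.out &
done
wait

# Then search every result at once for the vulnerable library.
grep -i nano nmap-192.168.0.*.out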
The mitigations you choose to implement have been explained well in the articles linked above, but finding these systems can be a pain.
Nineteen years ago my friend Ben talked me into starting this blog. I don’t update it as frequently any more for a variety of reasons, specifically because more people interact on social media these days and I’m not as involved in open source as I used to be, but it is still somewhat of an achievement to keep something going this long.
My “adventures” in open source started out on September 10th, 2001, when I started a new job with a company called Oculan to work on their open source monitoring platform OpenNMS. In May of 2002 I became the lead maintainer on the project, and by the time I started this blog I’d been at it for several months. Back then blogs were one of the main ways an open source project could communicate with its community.
The nearly two decades I spent with OpenNMS were definitely an adventure, and this site can serve as a record of both those successes and those struggles.
Nineteen years ago open source was very different than it is today. Today it is ubiquitous: I think it would be rare for a person to go a single day without interacting with open source software in some fashion. But back then there was still a lot of fear, uncertainty and doubt about using it, with a lot of confusion about what it meant. Most people didn’t take it seriously, often comparing it to “shareware” and never believing that it would ever be used for doing “real” things. On a side note, even in 2022 I recently had one person make the shareware comparison when I brought up Grafana, a project that has secured nearly US$300 million in funding.
Back then we were trying to figure out a business model for open source, and I think in many ways we still are. The main model was support and services.
You would have thought this would have been more successful than it turned out to be. Proprietary software costing hundreds of thousands if not millions of dollars would often require that you purchase a maintenance or support contract running anywhere from 15% to 25% of the original software cost per year just to get updates and bug fixes. You would think that people would be willing to pay that amount or less for similar software, avoiding the huge upfront purchase, but that wasn't the case. If they didn't have to buy support they usually wouldn't. Plus, support doesn't easily scale. It is hard finding qualified people to support complex software. I'd often laugh when someone would contact me offering to double our sales because we wouldn't have been able to handle the extra business.
One company, Red Hat, was able to pull it off and create a set of open source products people were willing to purchase at a scale that made them a multi-billion dollar organization, but I can’t think of another that was able to duplicate that success.
Luckily, the idea of “hosted” software gained popularity. One of my favorite open source projects is WordPress. You are reading this on a WordPress site, and the install was pretty easy. They talk about a “five minute” install and have done a lot to make the process simple.
However, if you aren’t up to running your own server, it might as well be a five year install process. Instead, you can go to “wordpress.com” and get a free website hosted by them and paid for by ads being shown on your site, or you can remove those ads for as little as US$4/month. One of the reasons that Grafana has been able to raise such large sums is that they, too, offer a hosted version. People are willing to pay for ease of use.
But by far the overwhelming use of open source today is as a development methodology, and the biggest open source projects tend to be those that enable other, often proprietary, applications. Two Sigma Ventures has an Open Source Index that tries to quantify the most popular open source projects, and at the moment these include Tensorflow (a machine learning framework), Kubernetes (a container orchestration platform) and of course the Linux kernel. What you don’t see are end user applications.
And that to me is a little sad. Two decades ago the terms “open source” and “free software” were often used interchangeably. After watching personal computers go from hobbyists to mainstream we also saw control of those systems move to large companies like Microsoft. The idea of free software, as in being able to take control of your technology, was extremely appealing. After watching companies spend hundreds of thousands of dollars on proprietary software and then being tied to those products, I was excited to bring an alternative that would put the power of that software back into the hands of the users. As my friend Jonathan put it, we were going to change the world.
The world did change, but not in the way we expected. The main reason is that free software really missed out on mobile computing. While desktop computers were open enough that independent software could be put on them, mobile handsets to this day are pretty locked down. While everyone points to Android as being open source, to be honest it isn’t very useful until you let Google run most of it. There was a time where almost every single piece of technology I used was open, including my phone, but I just ran out of time to keep up with it and I wanted something that just worked. Now I’m pretty firmly back into the Apple ecosystem and I’m amazed at what you can do with it, and I’m so used to just being able to get things going on the first try that I’m probably stuck forever (sigh).
I find it ironic that today’s biggest contributors to open source are also some of the biggest proprietary software companies in the world. Heck, even Red Hat is now completely owned by IBM. I’m not saying that this is necessarily a bad thing, look at all the open source software being created by nearly everyone, but it is a long way from the free software dream of twenty years ago. Even proprietary, enterprise software has started to leverage open APIs that at least give a nod to the idea of open source.
We won. Yay.
Recently some friends of mine attended a fancy, black-tie optional gala hosted by the Linux Foundation to celebrate the 30th anniversary of Linux. Most of them work for those large companies that heavily leverage open source. And while apparently a good time was had by all, I can’t help but think of, say, those developers who maintain projects like Log4j who, when there is a problem, get dumped on to fix it and probably never get invited to cool parties.
Open source is still looking for a business model. Heck, even making money providing hosted versions of your software is a risk if one of the big players decides to offer their version, as to this day it is still hard to compete with a Microsoft or an Amazon.
But this doesn’t mean I’ve given up on open source. Thanks to the Homebrew project I still use a lot of open source on my Macintosh. I’m writing this using WordPress on a server running Ubuntu through the Firefox browser. I still think there are adventures to be had, and when they happen I’ll write about them here.
Over the two-year course of this COVID-19 pandemic, I have taken extra steps to keep myself and my family safe. I’ve kept abreast of the latest medical advice and research. I’ve invested in N95 and KN95 masks. I’ve hauled around my HEPA air filter to places where proper ventilation would be hard to come by. Most importantly, whenever I’ve had the slightest concern that any health symptoms I’d been experiencing might have been COVID, I have gotten tested with Wake County’s free PCR COVID tests. Six times I’ve done this, and six times I received a relieving result of negative. Most recently, we were shipped a set of four COVID antigen tests free from the government, and a test using one of those turned up negative, too.
I kept my precautions up, thinking I had succeeded in avoiding a COVID infection. It turns out I may have been wrong and didn't even know it.
Last week, I noticed that one of my right toes stung a little and looked bruised. I didn't recall injuring it, so I wondered if it might be the "COVID toes" I'd heard about. See, COVID patients have reported sores on their toes (mainly toes, though fingers may be involved, too), and my toe looked suspiciously like this. COVID attacks the vascular system in addition to everything else it hits, and red toes can be a symptom. Around that time, I had an attack of my Raynaud's Syndrome, with some of my fingers turning numb and white for over an hour. This red toe effect could also be caused by Raynaud's (which is also a vascular disease), so I couldn't say for sure what was what. Thus, I popped open the antigen test and 15 minutes later it told me I was COVID negative. Sure, an antigen test is not as accurate as a PCR test, but this was at the height of my symptoms, so I assumed if I was going to pop positive on anything it would be right at that moment. But, no, it was negative!
Over the weekend, I got to thinking about how my body reacted to the primary, secondary, and booster COVID vaccines I had gotten. Basically, I didn’t react at all! There were no noticeable side-effects whatsoever. I was thinking about this and deciding that perhaps my reaction to the actual virus would be a similar non-event. I decided to contact the VA to schedule a COVID antibody test, knowing that this might show whether I’d been exposed and didn’t know it.
I got the blood drawn this past Monday morning. The result came back the next day and, like I had started to suspect, it was positive. My body has SARS-COV-2 antibodies.
Now, experts caution that this does not necessarily mean I had been exposed to SARS-COV-2 (COVID-19), only that my body knows how to fight it. It could be that sometime in the past I'd been exposed to a similar coronavirus. However, I think it's unlikely that it was anything other than one variant of COVID-19 or another. Most likely it was the omicron variant, the highly transmissible one responsible for more than 95% of current infections.
Just knowing I have antibodies, though, is a huge weight off of my shoulders. And if I was infected, the odds are high that my wife, son, and possibly my daughter have also had it and didn't know it. My son Travis has taken twice as many tests as I have and had them all come back negative. He was astonished to test negative one time after eating lunch in a closed car with his school buddies, many of whom tested positive. To me, that seems like evidence that Travis had already seen COVID and was immune. At the time he credited his vaccines, but there may be more to it than that. I think he feels better now, knowing that he, also, might have antibodies, though we still need to confirm this with a test.
I suppose if I had to get a positive COVID test result, this is the one to get. I’m glad I continued to protect folks outside of my bubble (and I will continue to do so), but with a teen in the home and one dropping in on a regular basis, it was inevitable that it was going to pass through at some point. Crazy that I never even knew it.
I think the title of this post is a little misleading, as I don’t have any news about Nextcloud. Instead I want to talk about the News App on the Nextcloud platform, and I couldn’t think of a better one.
I rely heavily on the Nextcloud News App to keep up with what is going on with the world. News provides similar functionality to the now defunct Google Reader, but with the usual privacy bonuses you expect from Nextcloud.
Back before social networks like Facebook and Twitter were the norm, people used to communicate through blogs. Blogs provide similar functionality: people can write short or long form posts that will get published on a website and can include media such as pictures, and other people can comment and share them. Even now when I see an incredibly long thread on Twitter I just wish the author would have put it on a blog somewhere.
Blogs are great, since each one can be individually hosted without requiring a central authority to manage it all. My friend Ben got me started on my first blog (this one), which in the beginning was hosted using a program called Movable Type. When their licensing became problematic, most of us switched to WordPress, and a tremendous amount of the Web runs on WordPress even now.
Now the problem was that the frequency with which people posted to their blogs varied. Some might post once a week, and others several times an hour. Unless you wanted to go and manually refresh their pages, it was difficult to keep up.
RSS is, as the name implies, an easy way to summarize content on a website. Sites that support RSS craft a generic XML document that reflects titles, descriptions, links, etc. to content on the site. The page is referred to as a “feed” and RSS “readers” can aggregate the various feeds together so that a person can follow the changes on websites that interest them.
Google Reader was a very useful feed reader that was extremely popular, and it in turn increased the popularity of blogs. I put some of the blame for the rise of the privacy nightmare of modern social networks on Google's decision to kill Reader, as it made individual blogs less relevant.
Now in Google’s defense they would say just use some other service. In my case I switched to Feedly, an adequate Reader replacement. The process was made easier by the fact that most feed readers support a way to export your configuration in the Outline Processor Markup Language (OPML) format. I was able to export my Reader feeds and import them into Feedly.
Feedly was free, and as they say if you aren’t paying for the product you are the product. I noticed that next to my various feed articles Feedly would display a count, which I assume reflected the number of Feedly users that were interested in or who had read that article. Then it dawned on me that Feedly could gather useful information on what people were interested in, just like Facebook, and I also assume, if they chose, they could monetize that information. Since I had a Feedly account to manage my feeds, they could track my individual interests as well.
While Feedly never gave me any reason to assign nefarious intentions to them, as a privacy advocate I wanted more control over sharing my interests, so I looked for a solution. As a Nextcloud fan I looked for an appropriate app, and found one in News.
News has been around pretty much since Nextcloud started, but I rarely hear anyone talking about its greatness (hence this post). Like most things Nextcloud it is simple to install. If you are an admin, just click on your icon in the upper right corner and select “+ Apps”. Then click on “Featured apps” in the sidebar and you should be able to enable the “News” app.
That’s it. Now in order to update your feeds you need to be using the System Cron in Nextcloud, and instructions can be found in the documentation.
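If you prefer the command line, the same setup can be sketched with Nextcloud's occ tool and a system cron entry. This assumes a fairly typical install under /var/www/nextcloud running as the www-data user, so adjust the path and user for your own setup:

# Enable the News app from the shell instead of the web UI.
sudo -u www-data php /var/www/nextcloud/occ app:enable news

# Tell Nextcloud to use system cron for background jobs (which is what updates the feeds).
sudo -u www-data php /var/www/nextcloud/occ background:cron

# Then add a crontab entry for the web server user (crontab -u www-data -e) along these lines:
# */5 * * * * php -f /var/www/nextcloud/cron.php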
Once you have News installed, the next challenge is to find interesting feeds to which you can subscribe. The news app will suggest several, but you can also find more on your own.
It used to be pretty easy to find the feed URL: you would just look for the RSS icon and click on it for the link.
But, again, when Reader died so did a lot of the interest in RSS, and finding feed URLs became more difficult. I have links to feeds at the very bottom of the right sidebar of this blog, but you'd have to scroll down quite a way to find them.
But for WordPress sites, like this one, you just add "/feed" to the site URL (for example, https://example.com/feed).
There are also some browser plugins that are supposed to help identify RSS feed links, but I haven't used any. You can also "view source" on a website of interest and search for "rss", and that may help out as well.
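That "view source" trick is easy to script, too. This is just a sketch, with example.com standing in for whatever site interests you; it fetches the page and prints any <link> tags that advertise an RSS or Atom feed:

# Fetch the page and pull out any feed links advertised in its markup.
curl -s https://example.com/ | grep -Eio '<link[^>]+(rss|atom)[^>]*>'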
My main use of the News App is to keep up with news, and I follow four main news sites. I like the BBC for an international take on news, CNN for a domestic take, Slashdot for tech news and WRAL for local news.
This wouldn’t be as useful if you couldn’t access it on a mobile device. Of course, you can access it via a web browser, but there exist a number of phone apps for accessing your feeds in a native app.
Now to my knowledge Nextcloud the company doesn’t produce a News mobile app, so the available apps are provided by third parties. I put all of my personal information into Nextcloud, and since I’m paranoid I didn’t want to put my access credentials into those apps but I wanted the convenience of being able to read news anywhere I had a network connection. So I created a special “news” user just for News. You probably don’t need to do that but I wanted to plant the suggestion for those who think about such things.
On iOS I use an app called CloudNews. It sometimes gets out of sync and I end up having to read everything in the browser and re-sync in CloudNews, but for the most part it's fine.
For Android the best app I’ve used is by David Luhmer. It’s available for a small fee in the Play Store and for free on F-Droid.
Like all useful software, you don’t realize how much you depend on it until it is gone, and in the few instances I’ve had problems with News I get very anxious as I don’t know what’s going on in the world. Luckily this has been rare, and I check my news feed many times during the day to the point that I probably have a personal problem. The mobile apps mean I can read news when I’m in line at the grocery store or waiting for an appointment. And the best part is that I know my interests are kept private as I control the data.
If you are interested, I sporadically update a number of blogs, and I aggregate them here. In a somewhat ironic twist, I can’t find a feed link for the “planet” page, so you’d need to add the individual blog feeds to your reader.