2017-09-11

Goodbye Blogger, Hello Jekyll.

I'm just tired of Blogger, and want something that works using the same tools I use for my everyday work, so I'm moving to a Jekyll-based blog, hosted over on GitHub.
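For anyone curious what that involves: a Jekyll blog on GitHub Pages is, at its simplest, just a repository with a Gemfile along these lines. (This is a generic sketch, not necessarily my exact setup.)

    # Gemfile -- a minimal GitHub Pages + Jekyll setup (Ruby)
    source "https://rubygems.org"

    # The github-pages gem pins Jekyll and its plugins to the versions
    # GitHub builds with, so a local preview matches what gets published.
    gem "github-pages", group: :jekyll_plugins

Run bundle install once, preview locally with bundle exec jekyll serve, and a push to the GitHub repository publishes the site.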

If you want to keep following, dig me at https://bear454.github.io . I'm sure I'll migrate my main domain at some point as well.

So long, and thanks for all the fish.

2017-01-20

'Web' applications and the circle of life.

Header image: Dijoantonycj, "8-UX-Pitfalls-To-Avoid-In-Mobile-App-Design" (https://commons.wikimedia.org/wiki/File:8-UX-Pitfalls-To-Avoid-In-Mobile-App-Design.jpg), licensed CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0/legalcode)


Web technologies are a funny thing.  There's no doubt that web applications have improved far beyond what was imagined in the early days of the web... everything is more performant, more interactive, and designs are more immersive. JS engines have gotten faster, HTML specifications have embraced more resources, including local device integration, and CSS has grown to allow rich, robust styling. We continue to push the envelope with technologies that will serve the web faster, like HTTP/2 and WebSockets. Web development has become a huge field. The combination of fast-moving technology and a growing workforce has, inevitably, allowed "web" technologies to migrate into other areas: Node.js runs many server-side applications; the GNOME desktop relies heavily on CSS and JS for implementing basic UI elements; HTML documentation pervades.

It's natural, then, that user-facing applications would escape the browser frame. In multiple iterations now, "web applications" have become just "applications" on the desktop or mobile platform. Somehow, though, they're never 'good enough' for general adoption by both the consumer and developer communities: adoption rises and falls like waves on the sea, each wave a new round of applications driven by advances in web technologies, only to be eclipsed by 'native' (C++, CLR/.NET, Objective-C, GTK/Qt) applications, over and over. Why? And when will web applications truly be 'good enough'?

I noticed, as I installed yet another Electron app, that I'm sitting at the top of another wave while, simultaneously, a prior wave of web applications is being eclipsed, and it made me reflect on just how many times we've been through this.

In 2007, Apple released the iPhone with the expectation that developers would use mobile Safari to build apps, for which 'installation' meant, mostly, adding a shortcut on the phone. Steve Jobs was quoted at the time[1]:

The full Safari engine is inside of iPhone. And so, you can write amazing Web 2.0 and Ajax apps that look exactly and behave exactly like apps on the iPhone. And these apps can integrate perfectly with iPhone services. They can make a call, they can send an email, they can look up a location on Google Maps.
And guess what? There’s no SDK that you need! You’ve got everything you need if you know how to write apps using the most modern web standards to write amazing apps for the iPhone today. So developers, we think we’ve got a very sweet story for you. You can begin building your iPhone apps today.

The App Store is largely viewed as a response to early jailbreakers and to complaints from a developer community that did not want to embrace web technologies; ever since, the volume of (native) apps in the App Store has been a regular measure of the platform's success.

In 2012, Mozilla once again brought web-based applications to the forefront, launching Firefox OS, which was described as "powered completely by Web technologies" and offering "everything you expect from a smartphone"[2]. In 2016, Mozilla demoted the project, a move typically attributed to lackluster sales and a continuous stream of negative reviews as Mozilla pushed toward its ultra-low price point goal of a $25 smartphone.

Google has found much more success with ChromeOS, and the broad market of Chromebooks available with the lightweight, web-centric OS. While early announcements appeared in 2009, and Google's CR-48 pilot hardware followed in late 2010, the first commercial Chromebooks didn't ship until mid-2011. ChromeOS may have had a slow start, but it quickly picked up steam, and hasn't slowed, despite having, basically, one native app: the Chrome web browser. Chrome apps, which run on both desktop Chrome and ChromeOS, "let you use HTML5, CSS, and JavaScript to deliver an experience comparable to a native application"[3].

Despite accelerating sales of Chromebooks[4][5], fed by their low price points, security improvements based largely on their small footprint and image-based OS delivery, and their ease of administration, Google still felt pressure to bring 'native' apps to ChromeOS. This is, ironically, a complete inversion of the usual native/web relationship. With Chrome as the foundation, and the Android marketplace chosen as the source for a larger application pool, we've seen an evolving series of Android emulation layers bolted onto ChromeOS... while this does pave the way for the larger pool of Android apps to run on Chromebooks, Chrome remains the native environment here, and Android apps will, for the near term at least, be burdened by the performance losses inherent in any emulation layer.

The newest wave of Chromebooks features much beefier specs than their web-centric predecessors, driving average price points up from around $200 to around $500[6][7]. That's largely in support of Android apps, but also to meet the needs of power users who prefer the basic design principles of an OS with a smaller attack surface, simplified maintenance, and easy access to developer tools over increasingly expensive MacBooks[8] or Windows machines whose resources are ever more consumed by the OS itself[9].

And yet, simultaneously, web applications are rising again. GitHub's Electron[10] project paves the way for desktop applications based on Chromium (the open-source upstream of Google Chrome) and Node.js, with tooling and instructions around packaging for Windows, macOS, and Linux (!!!). Electron grew out of the Atom project[11], an effort to provide an open-source, extensible code editor based on web technologies. (Full disclosure... Atom is my IDE of choice... for now.) Electron, serendipitously, released in the midst of a wave of next-generation communication tools, and has found its killer app in providing an easy way for Slack[12], rocket.chat[13], Riot[14], Wire[15] and Discord[16] to move their web-based applications to the desktop. The Electron app I installed today was Simplenote[17], an open-source tool for jotting quick notes and syncing them _everywhere_. I stepped back and realized that nearly all the new apps I'm trying, and most of the apps I'm using, are web-based: either packaged Electron apps like Atom and rocket.chat, or Chrome apps/frames like Trello[18], Signal[19], and Autodesk's TinkerCAD[20]. And the native apps on my desktop? Web browsers mostly (both Chrome and Firefox), along with a couple of 'legacy' apps: Pidgin for old-school chat, and Evolution for connection to my work mail.

So, where are you in the lifecycle of web applications? Riding the crest of Electron apps, or waiting for that shiny new Chromebook that lets you run emulated Android apps? Or do you live in iOS or Android, and can't even tell (or care) what technologies are under the hood of your apps? Are you surprised to find more web technology on your desktop than you expected? I'd love to hear!


[1] https://9to5mac.com/2011/10/21/jobs-original-vision-for-the-iphone-no-third-party-native-apps/
[2] https://blog.mozilla.org/blog/2013/07/01/mozilla-and-partners-prepare-to-launch-first-firefox-os-smartphones/

2016-11-01

Pumpkin Art

One of the oddities of holidays is how language changes context. Aside from the week before Halloween, imagine the rarity of the question "what are you carving on your pumpkin?"

Having heard this question a fair number of times last week, I reflected on some of my own pumpkin art, and wanted to share.

Impossible problems, simple solutions

Every day we confront problems with no viable solution. Sometimes we simply accept them; sometimes we struggle against them for years. Some are critical, but most are simple annoyances; to the careful (if pessimistic) observer, they're subtle, regular signs of failure.

For example, let me share a ridiculous lifelong struggle: microwaving a frozen burrito. This, it turns out, is an impossible problem. Sure, any idiot can throw a burrito in the microwave for a couple minutes, and have something edible, but it won't be good. The center might still be cold, or the tortilla may be hard as a rock. The filling may boil out, and most likely, the wrap will be soggy on at least one side. In short, there is nothing you can do to make a frozen burrito come out of a microwave well cooked. I should know; I've tried every possible thing. I've cooked it fast, and slow, and slow then fast. I've cooked it on a wet towel, under plastic wrap, on a special microwave cooking plate. It's impossible.

If this were code, the solution would be clear: deconstruct the problem. Break it up into smaller problems, and with enough iterations and subdivisions, eventually you're not even solving smaller problems, but just following the obvious, simple course from problem to solution. On the rare occasion that doesn't work, change the scope: go back to the original premise... you're likely to find you've made an incorrect assumption. Even for software developers trained in this mindset, though, applying this technique to everyday problems isn't obvious. If it were, there wouldn't be frozen burritos.

I'm writing this for a couple of reasons. Aside from the obvious advice, reminding developers that the tools we have for coding apply in real life as well, I'm also marking a turning point for this blog... which has long been an impossible problem for me. As much as I've wanted to fill it with good content centered around open-source technologies, I'm rarely able to find a subject that isn't better covered elsewhere. I want to blog more. I've been asked to blog more. But always the subjects elude me. It turns out the simple solution was a problem of scope: like this post, the things I want to share, that aren't said enough, simply aren't related to open source. So, starting with this post, I'm solving the problem... by changing the scope. From here on in, I'm writing about whatever I want, whatever I feel needs to be said, shared, repeated. Stick around, and see if we can have some interesting conversations.

P.S. Here's the solution to frozen burritos:

  • Microwave it whole, at high power, just long enough to defrost the tortilla completely.
    Usually 45 - 75 seconds.
  • Open it up and scrape the filling out into a bowl.
  • Microwave the filling at medium heat until it's at a safe temp - I look for boiling.
    Usually 3 - 4 minutes at 60%.
  • Spoon the filling back into the tortilla, wrap it back up, and cook the whole thing for 30 seconds to get the tortilla warmed up nicely.
  • Enjoy!
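
And since this started as a coding metaphor, here's the same recipe deconstructed the way I'd deconstruct a coding problem, as a quick Ruby sketch (the method names are my own, purely for illustration):

    # The recipe above, decomposed: one impossible step becomes
    # several small, obvious ones.
    def defrost_tortilla
      puts "Microwave whole, at full power, 45-75 seconds, just until the tortilla is pliable."
    end

    def separate_filling
      puts "Open it up and scrape the filling into a bowl."
    end

    def heat_filling
      puts "Microwave the filling at ~60% power for 3-4 minutes, until it boils."
    end

    def reassemble_and_finish
      puts "Spoon the filling back in, re-wrap, and microwave 30 more seconds."
    end

    def cook_frozen_burrito
      defrost_tortilla
      separate_filling
      heat_filling
      reassemble_and_finish
      puts "Enjoy!"
    end

    cook_frozen_burrito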

2016-02-01

(As a Rubyist) Python sucks.

I've been writing Ruby for almost 9 years now, and I love it. Ruby, for those who don't know it, is an extremely expressive, natural coding language with roots in Perl and Smalltalk. I comment less in Ruby than in any other language I've ever used, because the code tends to be clear and self-documenting. The community has a great ethos, driven by concepts like TDD/BDD (test- or behavior-driven development), DRY (don't repeat yourself), YAGNI (you ain't gonna need it), and convention-over-configuration.
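
To give a taste of what I mean by clear and self-documenting, here's a small, contrived example (not from any real project):

    require "date"

    # A contrived illustration: the method bodies read almost like
    # the sentences you'd use to describe them.
    Item = Struct.new(:name, :price, :due_on)

    class Order
      attr_reader :items

      def initialize(items)
        @items = items
      end

      def total
        items.map(&:price).inject(0, :+)
      end

      def overdue_items(as_of = Date.today)
        items.select { |item| item.due_on < as_of }
      end
    end

    order = Order.new([
      Item.new("widget", 9.99, Date.new(2016, 1, 15)),
      Item.new("gadget", 24.50, Date.new(2016, 3, 1))
    ])
    puts order.total                                 # => 34.49
    puts order.overdue_items.map(&:name).join(", ")  # => widget, gadget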

I've been writing Python for about 18 months now, and it's an amazingly powerful language, with a huge wealth of libraries and resources. I don't love it. In fact, the more I use it, the less I like it, and doing a project in Django has really sharpened that feeling. It's not that Python sucks, per se; I just feel like a lot of the lessons I've learned in the last 35 years of coding, and many of the concepts I've gotten used to as a Rubyist, just don't apply, and that makes me grumpy.

This Thursday, February 4th, I'll be presenting for the Bellingham Linux Users Group, where I'll expound further, hopefully amusingly and enlighteningly, on this topic.

Thursday, Feb 4, at 7pm in BTC room CC201.