How to use the very latest Bundler in Travis

TL;DR: When Bundler has fixes in master that you need, use the specific_install Rubygems plugin to install and use Bundler directly from a git branch. Example Travis YAML configuration excerpt:

before_install:
  - gem update --system
  - gem install specific_install
  - gem specific_install https://github.com/bundler/bundler.git
Picture: taken at Cambridge U Library. This is a CC0 free stock photo. But those are nice columns surrounding and wrapping the body of the page. Also: note the shadows of the supporting lines still visible in the finished book.

I will spend the rest of this post unpacking what the above means.

In Ruby, there is a package manager called Bundler. It’s continuously developed, and like all software, can have bugs.

When a Bundler bug gets fixed, the code changes are “merged to the master branch”. Then we all wait for the next release. The usual workaround is to stick to an older, non-broken version: use the -v option to choose an exact release. Example: gem install bundler -v '1.13.7'

The Rubygems system does not have a way to install gems “from a Git source URI”, but it does have a plugin system. And luckily, one of the plugins available (though not listed on that page) is specific_install.

allows you to install an “edge” gem straight from its github repository, or install one from an arbitrary url

In order to work around the “we all wait for the next release” step, we can install the latest and greatest using this plugin.

The plugin has also aliased its specific_install action to git_install. The manual claims:

This alias is shorter and is more intention-revealing
of the gem's behavior.
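Since the alias is just another name for the same action, the line in the Travis snippet above could presumably also be written as:

gem git_install https://github.com/bundler/bundler.git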

Source: The above Travis configuration snippet comes from a PR to the releaf project.

Cinemateket: a personal failure to build habit

The new slimmer Cinemateket. It never really came together, my going there out of habit. Habit-forming failure, if you will.

Now that they’ve scaled it all back to a trickle of must-see films from the nowadays-much-smaller showable-as-analog past, it’s lost much of its previous smorgasbord charm for me. Choosing not to view a certain film isn’t so very charming when it’s the only film.

In unrelated habit-building: My quest to see all of Star Trek: The Next Generation, with the redoubtable Sir Patrick Stewart intoning the Federation’s credo about not meddling where one should not meddle, continues, if a little abated. I’ve noted a tendency to fall asleep at the tail end of the story, in late act II. Note to self: perhaps one episode is enough for a night.

Recently, I have met several trekkies, whereas before I seldom met anyone who professed to know the slightest thing about the Star Trek franchise.  Of these enthusiasts, one is a lawyer, one a nurse who speaks Chinese, and one works with computer gaming industry issues. All of them highly knowledgeable on matters Trek.

Onward, into the year. Habits will form. I’m sure of it.


Here, a random Star Trek TNG piece of gossip: That replacement doctor – Dr. Pulaski – won’t come back in later episodes, because of the following, passively-worded reason:

Maurice Hurley, head writer and showrunner, did not like working with McFadden, and McFadden was fired at the end of season 1. Diana Muldaur joined the production as the Enterprise’s new chief medical officer, Dr. Katherine Pulaski, for season 2.

Series creator Gene Roddenberry admitted that the Dr. Pulaski character did not develop a chemistry with the other characters, so McFadden was approached to return as Dr. Crusher for the third season. At first, she was hesitant, but after a phone call from co-star Patrick Stewart, McFadden was persuaded to return to the role, which she subsequently continued to play through the remainder of the series.

Source: the Gates McFadden wikipedia entry


Export data from MongoDB 3.0 and import it into MongoDB 2.6

I just had the dream encounter with a dockerized app: a textbook use case for containerization.

I had to export some data from a MongoDB 3.0 server and import it into a 2.6 one. Ouch: the 2.6 tools can’t connect to a 3.0 server.

A genius sat down at the exact moment of my complaining. “Just use Docker.”

So, here is how to get and run the 3.0 Docker image of MongoDB, started with a bash shell:

docker run -ti --rm -v /Users/olle/slask/portal-test:/backup mongo:3.0 /bin/bash
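(Flag rundown: -ti gives an interactive terminal, --rm removes the container on exit, -v mounts the host folder /Users/olle/slask/portal-test as /backup inside the container, mongo:3.0 names the official image and release series, and the trailing /bin/bash overrides the default command so we get a shell instead of mongod.)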

(Related: this works because of the last line in the image’s entrypoint script; see 3.0/docker-entrypoint.sh.)
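From memory, the tail of that entrypoint script looks roughly like this (an approximation, not the verbatim file):

# docker-entrypoint.sh, approximate excerpt
if [ "$1" = 'mongod' ]; then
  # ...prepare /data/db, drop privileges, and exec mongod...
  exec gosu mongodb "$@"
fi

exec "$@"   # any other command, such as /bin/bash, is exec'd as-is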

Once the image has been pulled, the rest is automated, and we land in a shell.

There, we can do the 3.0 invocation of mongodump:

mongodump -h ds012345-a0.mongolab.com:12345 -d portal-test -u portal-user -p secret -o /backup/portal-test

This gave me a dump folder at:

/Users/olle/slask/portal-test/portal-test/portal-test

I had been too generous with the wrapping folders. No biggie.

Then, the local 2.6 invocation:

mongorestore -d portal-test /Users/olle/slask/portal-test/portal-test/portal-test
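To sanity-check the restore, something along these lines lists the collections that came across (database name as used above):

mongo portal-test --eval 'db.getCollectionNames()'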

Bam.

A disciplined method to fix Rubocop TODOs

Here, a listicle of advice you never asked for!

A method to locate and fix linting errors in Rubocop one at a time (a shell sketch of one round follows the list):

  • choose a “cop” from the .rubocop_todo.yml file
  • remove the line in .rubocop.yml which includes the .rubocop_todo.yml
  • rubocop --only Lint/IneffectiveAccessModifier
  • fix issues and test fixes with the --only invocation
  • rubocop --auto-gen-config to regenerate the .rubocop_todo.yml file
  • git checkout .rubocop.yml to add back the include line
  • commit this change with the name of the “cop” used
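Strung together as a shell sketch, assuming .rubocop.yml pulls the TODO file in via an inherit_from line (the common setup), one round looks roughly like this:

# assuming .rubocop.yml contains a line like:
#   inherit_from: .rubocop_todo.yml
# pick a cop, then temporarily delete that inherit_from line by hand
rubocop --only Lint/IneffectiveAccessModifier   # show every offence for just this cop
# ...fix the offences, re-running the --only invocation until it comes back clean...
rubocop --auto-gen-config                       # regenerate .rubocop_todo.yml
git checkout .rubocop.yml                       # restore the inherit_from line
git commit -am "Fix Lint/IneffectiveAccessModifier offences"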

I really really tried to use it at FakeFS.

Game prep

See, this list of game preparation is soon a year old:

  • Buy skulls as game chits.
  • Invite players and nag them to confirm attendance.
  • Read the text of the scenario to play.
  • Try out the conflict system. Practice run.
  • Realize the rules are many but smart.
  • List game activities.
  • Figure out snacks and food.


Hold Elasticsearch back on 2.4 for Chewy on macOS

So, Elasticsearch 5.0.0 is out.

I’m using Chewy, a Ruby DSL gem that’s quite good. It’s not quite there yet with the deprecated Query APIs (see the release notes), so to keep using it you need to hold back to the Elasticsearch 2.4 series, the last stable release before 5.0. (Question about this, answered in the issue tracker.)

Smoky mountain top
Every upgrade is a damn climb. Yesterday was a good day. It wasn’t until 15:00 that I came upon package management issues.

Here are a couple of Homebrew invocations to stay on 2.4, with full data loss, since the on-disk index version has been upgraded.

curl localhost:9200

The output for 5.0.0 is:

{
  "name" : "61mKC-n",
  "cluster_name" : "elasticsearch_olle",
  "cluster_uuid" : "EtOE3oHxQRSur5_qhRr-TQ",
  "version" : {
    "number" : "5.0.0",
    "build_hash" : "253032b",
    "build_date" : "2016-10-26T04:37:51.531Z",
    "build_snapshot" : false,
    "lucene_version" : "6.2.0"
  },
  "tagline" : "You Know, for Search"
}

This is the version 5.0.0 that we can’t use with Chewy.

brew services status elasticsearch

If you don’t have one of those services running, you can skip a few steps.

brew services stop elasticsearch
brew unlink elasticsearch 

Now, with a clean slate, we can ask Homebrew to install a specific version (actually, a release series, “2.4”):

brew tap homebrew/versions
brew install elasticsearch24 
brew services start elasticsearch24

If you now run curl localhost:9200 you may not see JSON output: if your index has been running on 5.0.0, its Lucene index version will have been bumped, so 2.4 can no longer read it.

The data files used by your Elasticsearch reside in /usr/local/var/elasticsearch. My node was called elasticsearch_olle and I decided to move all of its data out to a non-important folder on my hard drive:

mv /usr/local/var/elasticsearch/elasticsearch_olle ~/slask

After that, I could

brew services restart elasticsearch24

and then curl localhost:9200, to see the following 2.4 output:

{
  "name" : "Miss America",
  "cluster_name" : "elasticsearch_olle",
  "cluster_uuid" : "ym_Ovns4RHWqh-ZRcHWxcw",
  "version" : {
    "number" : "2.4.1",
    "build_hash" : "c67dc32e24162035d18d6fe1e952c4cbcbe79d16",
    "build_timestamp" : "2016-09-27T18:57:55Z",
    "build_snapshot" : false,
    "lucene_version" : "5.5.2"
  },
  "tagline" : "You Know, for Search"
}

“Winning.”

Photo credit: Kamil Szybalski

TODO: Buy a Cinemateket Malmö card today!

Analog film is dying. Digital copies are slow in the making. Perhaps a percent of all film is digitally available. It’s becoming harder to borrow film from archives.
So, not even the places showing old films know how it’s going to be.
Therefore, Malmö’s wonderful film club Cinemateket will sell half-year memberships now. 200 SEK to see all films – in a perfect program – until the end of 2016.
“Stora Cinemateketkortet is a membership card for those who visit us really often. With it, you get into our regular film screenings for free. Stora Cinemateketkortet is valid for one year and costs 400 SEK. Since we don’t know what Cinemateket’s operations in Malmö will look like in 2017, this autumn we are selling cards that are only valid until the end of the year (no matter when you buy them). In return, they are sold at half price.”
Do it!

Pep before PolyConf 2016

I sat down to prepare to get to this year’s PolyConf. I trawled last year’s photos, and found these shots.

(All these photos were taken by polyconf. They’re also All Rights Reserved. So, just links instead of using the awesome paste-a-link-with-oEmbed-support.)

Me having coffee – hair-styling is remarkably difficult in high summer heat.

Will, author of The Reasoned Schemer, having coffee. Sadly, this is the most flattering shot of him in the collection. The spirited Salt Lake City hacker, whose infectious enthusiasm warmed many conversations, was a kind soul and a C64 alumnus. Since then, I’ve bought his book and read half of it, perhaps. It’s mind-bending and kind at the same time. Progress must be slow, as I’m doing this on my own.

Portrait of Robert Virding, author of LFE, who is tending his language and its community at lfe.io, a decidedly sub-cultural website; for instance, by translating other people’s blog posts on Erlang to be about LFE. My computer still has an LFE install.

Robert Virding directing Joe Nash (Joe, who presented many a hackathon the world over)

Alban, conference organizer, is a French cinéaste as well as a person who computers a lot

The Bodil birthday cake happened during the closing party

Portrait of Stefan Karpinski, Julia author who’s so genuinely nice that I believe he can run a programming language community. I also believe that community can open “fast execution” to many more people.

Portrait of Leah Hanson, Julia hacker who is working on Learning Julia for O’Reilly (to be published in 2016).

Portrait of Amir, MirageOS hacker who was also kind and nice. This community of communities is incredible like that. See Amir’s homepage and the MirageOS Marrakech Spring 2016 hackathon reports for more notes and inspirational links.

Portrait of José Valim, Elixir author. Someone I never spoke to, but his output is consistent and excellent.

Portrait of Jessica Kerr, functional programming hacker who made (and makes) great talks. See her talks page.


KTHXBYE!

Architecture Decision Records

adr-tools is a set of command-line tools by Nat Pryce that allows people in software projects to record their design decisions, as they go along, a bit like the captain’s log on the Enterprise. (Much more on such log entries at this wiki for trekkers.) You type adr new Use Postgres, and it creates a formatted Markdown file, with headings in place, ready for you to fill out, fleshing out your reasoning for the decision.
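For instance (the directory and file name below are from memory of the tool’s defaults, so treat them as approximate):

adr new Use Postgres
# creates something like doc/adr/NNNN-use-postgres.md, with the Nygard-style
# headings (Status, Context, Decision, Consequences) in place, ready to fill out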

Computer book author Michael T. Nygard explains ADR in much more detail in this 2011 blog post. (Book tip: Release It! Nine years old, now. It’s about the things you have in production.)

I had the good fortune of getting to make a Homebrew package (a tap, in the parlance) for that tool, making installation a one-liner. This is such a package: homebrew-adr-tools. That’s all there is to it: just a homebrew-prefixed GitHub repository with the formula file in it.
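With the tap in place, installation is roughly the following (the GitHub user name is a placeholder for wherever the homebrew-adr-tools repository actually lives):

brew tap <github-user>/adr-tools   # brew resolves this to github.com/<github-user>/homebrew-adr-tools
brew install adr-tools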

I tried adding a feature to the tool, which led me to its output-expectation-based test suite with its attendant Makefile. Pretty decent experience using them!