I spent a couple of days trying (and SPOILER 🙈 eventually succeeding) to install a 3rd party dependency on Heroku this week. I’ve not had to do this before as most stuff I’ve worked on has had a fairly vanilla setup.
My first attempt was via heroku-buildpack-apt, as packages were available. This seemed the most straightforward approach, but it was quickly stymied by the fact that `apt` couldn’t download the public keys needed to validate the packages 😢
So I forked a custom buildpack from elsewhere and thought I was onto a winner, until I realised that the buildpack included a statically built binary. Not good, as I couldn’t verify its authenticity. I decided to build my own static binary so I could have control, but a C++ `cmake` headache ensued where a static binary refused to be static 😭
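For anyone fighting the same battle, this is roughly the kind of incantation involved. A minimal sketch only: these are standard CMake variables, but whether the result is genuinely static depends on every dependency actually shipping a static library (which was exactly my problem).

```cmake
# Prefer static archives (.a) when resolving find_library/find_package
set(CMAKE_FIND_LIBRARY_SUFFIXES ".a")

# Build this project's own libraries statically
set(BUILD_SHARED_LIBS OFF)

# Ask the linker for a fully static executable
# (glibc makes this awkward; it warns about NSS needing dynamic loading)
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -static")
```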
I ended up with a dynamically built binary installed via the custom buildpack, with its dependencies installed via `apt`.
It’s not perfect, but it is done.
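For reference, the `apt` side of this is just an `Aptfile` in the repo root, which heroku-buildpack-apt reads and installs at build time. The package names below are placeholders, not what I actually installed:

```
# Aptfile: one package per line; the buildpack apt-get installs each one
libexample-dev
example-tools
```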
The default output from `cargo test` isn’t great, imo. It’s a bit verbose for my liking. It seems that rain thought the same and released cargo-nextest this week. I’ve only used it very briefly, but it looks great so far and much more along the lines of other test runners I’ve used.
Announcing cargo-nextest: a new test runner for Rust projects! nextest has a beautiful user interface, several new features, and is up to 60% faster than cargo test.
```
$ cargo nextest run
    Finished test [unoptimized + debuginfo] target(s) in 0.08s
    Starting 6 tests across 3 binaries
        PASS [ 0.039s] pushover_cli pushover::tests::invalid_user
        PASS [ 0.039s] pushover_cli pushover::tests::invalid_token
        PASS [ 0.039s] pushover_cli pushover::tests::five_hundred_error
        PASS [ 0.038s] pushover_cli pushover::tests::missing_message
        PASS [ 0.045s] pushover_cli::integration_test version_flag
        PASS [ 0.045s] pushover_cli::integration_test help_flag
     Summary [ 0.045s] 6 tests run: 6 passed, 0 skipped
```
Announcing a new crate assay which is a super powered testing macro for @rustlang! async test support, per test env vars, automatic temp directory, auto ? in tests, setup and tear down functions and more! You can read more about how to use it here:
I haven’t tried this yet, but I’m about to do some work involving ENV vars so this is timely.
Apparently this is a thing that’s happening? 🤔
The South West Ruby meetup was back this week! I haven’t been to any sort of meetup in a long time for obvious reasons, and I had to push myself to go to this. I’m anxious at the best of times, and not being around large groups of people for a while has only made that worse. I’m really glad I went though!
There was a really interesting talk by Christian Buckmayer from Shopify, who works on their CI: “Keeping Developers Happy With A Fast CI”. It wasn’t recorded, but Christian also gave this talk at RubyConf 2021 if you want to watch it.
It was lovely to catch up with people I hadn’t seen in ages.
The Kobayashi Maru of Comparing Dates with Times by Zach Holman
Don’t let anyone tell you that date maths is easy.
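A tiny Ruby illustration of the core problem (the dates here are made up): a Date covers a whole day while a Time is a single instant, so whether they’re “equal” depends entirely on which one you normalise.

```ruby
require "date"

date = Date.new(2022, 2, 18)            # the whole day
noon = Time.new(2022, 2, 18, 12, 0, 0)  # one instant within that day

# Promote the Date to a Time and you get midnight, which != noon...
puts date.to_time == noon   # false
# ...but truncate the Time to a Date and suddenly they are "equal".
puts noon.to_date == date   # true
```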
I’m really keen to start using the latest Ruby 3.1 so I can take advantage of this error highlighting gem. I can’t help but feel like someone has taken a look at the error messages from Rust and tried to bring a tiny part of it to Ruby 🙂
```
test.rb:9:in `user_name': undefined method `[]' for nil:NilClass (NoMethodError)

data[:result].first[:first_name]
                   ^^^^^^^^^^^^^
	from test.rb:14:in `<main>'
```
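For context, a hypothetical `user_name` like this would trip that exact error: `.first` on an empty array returns nil, so the chained `[:first_name]` raises, and Ruby 3.1’s error_highlight underlines the precise call that blew up. (This is a reconstruction, not my actual file; the line numbers won’t match.)

```ruby
def user_name(data)
  data[:result].first[:first_name]
end

begin
  # An empty result list means .first is nil, so [:first_name] raises.
  user_name({ result: [] })
rescue NoMethodError => e
  puts e.message  # on Ruby 3.1, error_highlight appends the underlined source
end
```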
I don’t think any engagement-related metric is worth angering users in this way—even if it really does help users discover new content or stay subscribed longer. I’m reminded of the old saying, “People won’t remember what you said, but they will remember how you made them feel.” It applies to apps as well as people.
The incentives for product managers within companies are often in direct opposition to what a user wants from an app. Do you get a promotion/pay rise for driving up “engagement” by 20%, or for making users happy?