I’ve been using Google’s various services as much as the next person, but around the time I decided to buy my FairPhone, I also realized I didn’t want gapps (the base package of Google Apps, which includes your Google account on Android, the Play Store and Gmail) on my new phone, and that I’d have to find some alternatives.
There are many good reasons to limit your exposure to Google. If you’re reading this, you’ve probably already decided it’s worth thinking about, so I’ll focus on the how, rather than the why.
First of all, my goal isn’t to stop using Google, but to limit its access to me, especially on my phone. Using gapps, Google has access to practically every aspect of my phone’s state. The first step was not to install gapps on my new FairPhone. Thankfully, gapps isn’t preinstalled on the FairPhone’s default ROM.
Based on several articles, with varying degrees of OSS purism, I’ve set up the following:
1Mobile Market contains the vast majority of free Android apps, but doesn’t make any promises with regard to compatibility, the way Google Play does.
It also doesn’t let you buy the pro versions of many apps (like Titanium Backup).
It does manage version updates and handles your APKs reasonably well.
F-Droid is another alternative, but it focuses solely on free software (as in both beer and speech). I find it a bit limiting in its purism, but it’s still a very useful resource, listing apps’ anti-features (example).
Since I’m not using Google Play, I bought the license for the pro version directly off the author’s website and paid with PayPal. Expect a wait of about 30 minutes between payment and receipt of the license file by email.
Export the Google contacts from your phone (exporting from the website doesn’t include the avatars) to a VCF file, which you can then upload to Fruux.
I’ve had a bit of trouble with DOB records, which I had to remove from the file manually with a text editor (Notepad++ FTW).
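If you’d rather script the cleanup than edit by hand, a one-liner will do; this is a sketch that assumes the offending lines are the standard vCard BDAY fields (the sample data below is only there to make the snippet self-contained — point it at your real export instead):

```shell
# Sample export, just so this snippet runs on its own;
# in practice contacts.vcf is the file exported from the phone.
printf 'BEGIN:VCARD\nFN:Alice\nBDAY:1980-01-01\nEND:VCARD\n' > contacts.vcf

# Strip birthday (BDAY) records; adjust the pattern if your
# export names the date-of-birth field differently.
grep -v '^BDAY' contacts.vcf > contacts-clean.vcf
```

contacts-clean.vcf is then the file you upload to Fruux.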
After merging all my contact info manually on the Fruux website (I exported all contacts, so many of them had duplicates from Skype and other accounts), I had a decent contact list, ready to be used on my phone.
Note that the VCF export on the phone uses low-quality avatar images. I haven’t figured out how to export these correctly, so I replaced the LQ images with better ones on my phone for the contacts I cared enough about (close friends and family). As thumbnails, the LQ avatars look just as good as the originals.
Export your Google calendar to a file and import it on Fruux. Nothing much to it.
Right now Fruux doesn’t support consuming CalDAV calendars from other providers, like your usual religious and national holidays, but you can import them into Google and export them as a file, so it should be good enough for the upcoming year or so, till they hopefully release that feature.
You can actually use Google Maps even without a Google account set up on your phone, and I do. It simply has the best POI data for traveling abroad; coupled with public transit navigation, it’s really hard to beat.
Moovit (on 1Mobile Market) claims to be a viable alternative for public transit navigation, but doesn’t seem to have predefined POIs or the ability to save your own POIs.
You can easily export maps from Google’s My Maps into the KML file format, which OruxMaps supports. It’s not as dynamic as the native Google My Maps flow (save on desktop, load on mobile), but it’s definitely close enough.
I’m using TeamCity for both building and deploying my artifacts to the different environments (DEV, CI, PROD, etc…).
I’m also using the daily Build History Clean-up feature to conserve disk space.
This means I might deploy to QA for manual testing, and by the time the tests are done (in some cases this could take a couple of days or more), the clean-up may have already removed the release candidate build (and its artifacts) from disk, forcing a compromise between shipping a release that’s not fully tested and delaying the release on account of unplanned re-testing.
The solution is build pinning, which is the equivalent of Jenkins’ “keep this build forever” button.
But you can’t ask a build configuration to pin your build automatically on success, and in my case, I want to pin the build I depend on (I have one build configuration that builds artifacts, and other build configurations that deploy those artifacts to the various environments).
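For completeness, TeamCity’s REST API does let you pin a build with a simple PUT request, so one workaround is a final build step that pins itself. The sketch below composes the request as a dry run (server URL, build id and credentials are placeholders; inside an actual build step you’d use TeamCity’s %teamcity.serverUrl% and %teamcity.build.id% parameters instead):

```shell
# Placeholders standing in for TeamCity's %-style parameters.
SERVER="https://teamcity.example.com"
BUILD_ID="12345"

# The REST endpoint for pinning; the request body becomes the pin comment.
PIN_URL="$SERVER/httpAuth/app/rest/builds/id:$BUILD_ID/pin/"

# Dry run: print the command instead of calling a server that isn't there.
echo curl -X PUT --user "pinner:secret" \
     --data "Pinned automatically on success" "$PIN_URL"
```

This only works for the build pinning itself, though; pinning a dependency’s build from the deploying configuration is exactly the gap the plugin below fills.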
This plugin provides the build start timestamp as a parameter, which is great
The timestamp format isn’t easily configurable
Installation is again non-standard
The plugin notes say it’s a bit of a memory hog, and it was meant as a demo anyway
So I just wrote my own.
Formatted Date Parameter provides a configuration parameter (named build.formatted.timestamp), which during the build will contain the build start timestamp. The timestamp format is ISO-8601 by default (“yyyy-MM-dd’T’HH:mmZ”). The timestamp format can be configured using another configuration parameter (named build.timestamp.format), which uses standard SimpleDateFormat syntax.
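For example, you could override the format and reference the timestamp with TeamCity’s usual %-syntax in another parameter or build step (the artifact.name parameter here is purely an illustration, not something the plugin defines):

```
build.timestamp.format=yyyyMMdd-HHmm
artifact.name=myapp-%build.formatted.timestamp%.zip
```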
Even a snazzy networked NUT setup, covering all your Windows and Linux boxes at home and making sure they all shut down when your overkill UPS goes into the Low Battery state, can seem like a waste of time if you have to come back home and start everything manually before your batcave is back online.
Enter the Raspberry Pi. This is a Linux box that will immediately start booting when your power comes back on, assuming you’ve connected it directly to the outlet and not through your UPS. All that’s left is to wake the rest of your boxes once the Pi’s booted.
In my case I wrote a short script and put it in /usr/local/scripts/wakeEmUp.sh:
#!/bin/bash
# Wait a full 10 minutes after boot before waking anyone
sleep 600
echo "Waking up everybody"
wakeonlan -f /etc/homeLan.wol
The sleep in there makes sure you won’t start your other computers before a full 10 minutes have passed after the Pi finished booting. This will prevent “flapping”.
/etc/homeLan.wol is a file in the following format:
# This is an example of a text file containing hardware addresses
# File structure
# - blank lines are ignored
# - comment lines are ignored (lines starting with a hash mark '#')
# - other lines are considered valid records and can have 3 columns:
# Hardware address, IP address, destination port
# the last two are optional, in which case the following defaults
# are used:
# IP address: 255.255.255.255 (the limited broadcast address)
# port: 9 (the discard port)
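A record for one machine might look like this (the MAC address and broadcast address here are made up — use your own):

```
# my desktop, woken via the LAN's directed broadcast address
01:23:45:67:89:ab 192.168.1.255 9
```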
Finally, we need to actually run this when the Pi boots. This can easily be done by adding the invocation to /etc/rc.local:
# This script is executed at the end of each multiuser runlevel.
# Make sure that the script will "exit 0" on success or any other
# value on error.
#
# In order to enable or disable this script just change the execution
# bits.
#
# By default this script does nothing.

# Print the IP address
_IP=$(hostname -I) || true
if [ "$_IP" ]; then
  printf "My IP address is %s\n" "$_IP"
fi

# Wake the rest of the LAN; run in the background so boot isn't delayed
/usr/local/scripts/wakeEmUp.sh &

exit 0
I have two UPS units at home powering my batcave, which will get its own post sometime. Check_MK’s agent architecture really appeals to me, so I naturally wanted to leverage it in this solution.
Basically, the Check_MK agent is a small bash script that dumps a whole lot of plain-text info to STDOUT. This output is usually exposed over the network using xinetd, so when you open a socket on the agent’s port (6556 by default), with a telnet command for example, you just get the whole output dumped right out.
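For reference, the xinetd side is a small service definition along these lines (paths and the only_from line vary by distro and network — treat this as a sketch, not my exact config):

```
service check_mk
{
        type           = UNLISTED
        port           = 6556
        socket_type    = stream
        protocol       = tcp
        wait           = no
        user           = root
        server         = /usr/bin/check_mk_agent
        only_from      = 127.0.0.1 192.168.1.0/24
        disable        = no
}
```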
This means that adding output is very simple, especially as the agent will try to run any executable file in its plugins dir (you can just read the agent script to see where that is on your installation). Creating the following script in the plugins dir simply adds its output to the text dumped by the agent when it’s invoked.
#!/bin/bash
# Only emit anything if NUT's upsc client is installed
if which upsc > /dev/null 2>&1 ; then
    # Section header; must match the name the server-side check looks for
    echo '<<<nut>>>'
    # One block per UPS, each line prefixed with the UPS name
    for ups in $(upsc -l); do
        upsc $ups | sed "s,^,$ups ,"
    done
fi
I’ll go into further detail on my NUT setup in a later post.
Now we have the data, which is great, but Check_MK can’t make heads or tails of it. We need to parse it on the server side. Since Check_MK uses a Python API, we have to use Python to parse our data. The parser should go in your Check_MK server’s checks directory.
Since the Python script itself is a bit on the long side, you can find it in my Check_MK plugin repo on GitHub. The script was created for Check_MK v1.1.12p7, but if you’re interested in adapting it to the latest (and more elegant) API, you’re welcome to fork it on GitHub or ask me to do it.
The end result is quite pleasing and allows me to easily track my UPS load, get live email alerts about power outages at home and have all this data collected into RRDtool graphs through PNP4Nagios.
A few months ago, I got a new desktop and handed down my previous desktop to my wife as a gaming rig to complement her day-to-day laptop.
Ending up with two computers but the same single set of speakers raised the question of how to connect the audio from both computers to the speakers.
One’s immediate instinct is to just connect all the wires and let ‘er rip. However, that’s a very bad idea.
This post explains exactly why it’s a bad idea and describes a simple mixer circuit that sums two inputs into one output. However, it doesn’t allow any mixing control.
In case you’re not familiar with circuits and whatnots, check out sparkfun.com for some tutorial goodness.
This doesn’t deal with a very common situation where the volumes of the two inputs aren’t set to comparable levels, causing the output to effectively feel one-sided. With no means to control the mixing ratio, I felt it wasn’t good enough. Additionally, if one of the inputs is connected to a computer that’s currently shut down, some annoying static noises can creep in, so an option to temporarily disconnect one of the inputs would be nice.
My design required a few specific components. For example, the two pots are actually a double potentiometer (two pots on a shared axis) affecting both left and right channels with the same intensity. The switches are also double switches (123circuits only had triple switches), so both U1_S1 and U1_S2 should connect and disconnect together, without actually touching one another electrically (no crossing the streams or sword-fighting, if you will).
It took me a while to collect everything I needed, including buying solder for the first time in about a decade. But last week I was finally ready to build.
Here’s the final result:
Using it is as intuitive as pointing the knob in the direction of the input you want to be more dominant. I skipped the switches, as the ones I had from dx.com were a bit flimsy and messed with the audio a bit. I will probably post an update once those are added.
The actual inputs are aligned with the labels and directionality of the pot, so it should all just “make sense” to the user. Output is in the middle.
This was my first soldering work in over a decade so I won’t be posting any close ups of that poor poor PCB.