RASH: Advanced Shell History and Git Usage

In my software development career, I’ve slowly been trying to better organize, document and streamline my personal development workflow. I’m well aware that Bash history exists, and on all my boxes I configure the history file to keep an unlimited number of entries. But I find using Bash history … less than elegant. I daresay my individual project layout is approaching something like a per-project environment or a VM-based development workflow (which I tried for a time), but currently I’m finding that RASH is a useful tool in almost any of the above contexts for software development.

What is RASH

It’s a cute 4-letter application name 😀 haha (what with Drush and Yum and Apt-Get, Rash fits right in). But no, seriously: RASH stands for Rash Advances Shell History.

A long time ago I was searching for an SQLite-based shell history tool and found something on Google Code that used SQLite, and later (back in 2013) found Rash. After using Git a lot lately and building many projects, I’ve re-enabled Rash on my system(s) and made more use of it of late.

Rash is a Python daemon that stores your terminal history, along with environment details and other meta information, in JSON files in your home directory. Periodically, a RASH daemon service ingests the collected JSON files into an organized SQLite database on your system. Then, as needed, you can (interactively) search through your RASH history in a way similar to, but more powerful than, basic Bash history.

For full documentation on how Rash works and the basic Rash commands, please see the great documentation created for the project: https://rash.readthedocs.org/en/latest/
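For orientation, getting Rash running amounts to roughly the following. This is a sketch from memory of the documented setup, so defer to the docs above if your version differs:

# Hook Rash into the shell so every command gets recorded as JSON
# (goes in ~/.bashrc, per the Rash docs; verify against your version)
eval "$(rash init)"

# The init hook normally launches the indexing daemon for you, but it
# can also be run by hand; it folds the JSON records into the SQLite DB
rash daemon

# Then search the indexed history, e.g. commands run under this directory
rash search --cwd-under . "git*"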

An example listing of the SQLite tables Rash uses is below:

[Image: Rash SQLite table listing.]
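If you want to poke at the database directly, the stock sqlite3 client works fine. I believe rash locate db prints the database path (check the docs for your version); otherwise look under ~/.config/rash/:

# Open Rash's SQLite database and list its tables
db=$(rash locate db)    # should print the database path; verify in the docs
sqlite3 "$db" ".tables"

# Inspect the table structures before writing ad-hoc queries against them
sqlite3 "$db" ".schema"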

How I Am Using Rash

As I work with large Git projects more and more, I find documenting things very important. More to the point, I find that automating documentation for a project could be a big benefit for myself on a day-to-day basis, but could also serve as documentation for people who take over my projects in the future. So if Rash can record my history and be useful later on, why not use it?

So I created some Bash functions in my .bash_aliases file to more easily use RASH daily:

# Provide some useful Rash commands

# Count every command recorded in the last day
function rash-count-in-day() {
    rash search --limit -1 --no-unique --time-after "1 day ago" | wc -l
}
# Dump today's commands run under the current project into a digest file
function rash-today() {
    rash search --with-command-id --cwd-under . --time-after "1 day ago" --sort-by count --limit -1 > rash_daily_digest.txt
}
# Show today's commands from anywhere on the system
function rash-today-all() {
    rash search --with-command-id --time-after "1 day ago" --sort-by count
}
# Search commands run under the current directory for a pattern
function rash-cwd() {
    rash search --with-command-id --cwd-under . "$1"
}
# Search commands run after the given time for a pattern
function rash-time-after() {
    rash search --with-command-id --time-after "$1" "$2"
}

So, as I work on any project under Git control, prior to leaving work I like to run the command rash-today from my project root folder. This automatically generates a rash_daily_digest.txt file which I can commit to Git. This way, when I return to work on subsequent days, I can look back over the commands I ran. If I really wanted, I could add some Markdown markup to the rash output and have something pretty to look at in either GitLab or GitHub.
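For instance, something like this little wrapper would do the Markdown part. The awk mangling and file names are just my sketch, assuming the two-column count/command digest format shown later in this post:

# Sketch: wrap the rash-today digest in a Markdown table
{
    printf '# Daily command digest (%s)\n\n' "$(date +%F)"
    printf '| Count | Command |\n|---|---|\n'
    awk '{ c=$1; $1=""; sub(/^ +/, ""); printf "| %s | `%s` |\n", c, $0 }' \
        rash_daily_digest.txt
} > rash_daily_digest.md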

But the important thing is that the log of the commands I issued within my project folders is not lost to the sands of time, should I revisit the task(s) I completed that day.

An example of the rash-today output can be seen below. In this project I am creating Bash scripts that leverage recordmydesktop and ffmpeg to automate recording/editing/archiving Nooma.tv classes I attend, so you can see some of the project development and the commands I created in the last 24 hours. Some of these commands will obviously be placed into project scripts (such as the ffmpeg concat command), but the point is not to lose work I issued in the shell that is not recorded in Git, nor easily inspected (by me) using basic Bash history, should I suddenly stop working for the day.

  328  ls
  244  ./noomatv_record.sh 0 15
  326  cd ..
  274  git status
  312  ./convert_ogv_to_webm.sh ./workflow/edits/nooma.2016-02-23.ogv ./workflow/final/nooma.2016-02-23
  323  clear
  233  ./noomatv_record.sh 1 2
  240  ./noomatv_record.sh 45 90
  325  ls -lah
  278  sudo apt-get upgrade
  239  ./noomatv_record.sh 60 90
  248  .n/
  283  cd footage/
  281  cd workflow/
  320  cd workflow/final/
  315  cd ~
  287  ffmpeg -f concat -i <(for f in ./*.ogv; do echo "file '$PWD/$f'"; done) -c copy working.ogv
  271  git add workflow/*
  273  git add workflow/.gitkeep
  275  git commit
  276  git push syno-gitlab master
  296  less convert_ogv_to_webm.sh
  253  mkdir -p workflow/archive
  255  mkdir -p workflow/edits
  257  mkdir -p workflow/final
  254  mkdir -p workflow/footage
  256  mkdir -p workflow/images
  251  mkdir workflow
  264  mv 02*.ogv workflow/archive/
  268  mv nooma.2016-02* workflow/footage/
  266  mv nooma.2016-02-18\:18\:58\:14.ogv workflow/archive/
  291  mv nooma.2016-02-23.ogv ../edits/
  289  mv working.ogv nooma.2016-02-23.ogv
  236  rm nooma.2016-02-23\:18\:30\:23.ogv
  277  sudo apt-get update
  252  touch workflow/.gitkeep
  258  touch workflow/archive/.gitkeep
  260  touch workflow/edits/.gitkeep
  259  touch workflow/footage/.gitkeep
  261  touch workflow/images/.gitkeep
  322  vlc nooma.2016-02-23.webm
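As an aside, the ffmpeg concat one-liner (command 287 in the digest) is a good example of the shell work I mean to graduate into a project script. A minimal sketch of what that script might look like (the name, argument handling and -safe 0 flag are my additions here, not from the actual project):

#!/bin/bash
# concat_ogv.sh - losslessly concatenate the .ogv files in a directory.
# Hypothetical wrapper around command 287 from the digest above.
set -euo pipefail

src_dir=${1:?usage: concat_ogv.sh SRC_DIR OUT_FILE}
out_file=${2:?usage: concat_ogv.sh SRC_DIR OUT_FILE}

# Build the file list ffmpeg's concat demuxer expects, then stream-copy
ffmpeg -f concat -safe 0 \
    -i <(for f in "$src_dir"/*.ogv; do echo "file '$(readlink -f "$f")'"; done) \
    -c copy "$out_file"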

I hope you can see how this tool can be helpful in recording your daily workflow. Of course, the output could use some polish (I don’t really care right now which apt-get commands I ran), but I believe it’s useful even now for seeing what I ran today.

Note: As a further example, I’ve also taken to creating per-project Drush shell aliases that, for example, issue a simple node count report per content type in Drupal. If you work on a live site from daily snapshots, you could track how the site’s node counts change daily in a simple report text file, similar to my rash-today output above, and accrue a set of small daily reports of how your website/project is evolving. It’s difficult to be more precise on all this as I’m still actively developing my workflow(s).
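A per-project function for that might look something like the sketch below. The function name and output file are hypothetical, and the query assumes a standard Drupal 7 node table and a site already wired up for Drush:

# Hypothetical helper: append today's node counts per content type to a
# report file that lives in the project repo
function node-count-report() {
    {
        echo "## Node counts for $(date +%F)"
        drush sql-query "SELECT type, COUNT(*) AS total
                           FROM node
                          GROUP BY type
                          ORDER BY total DESC;"
    } >> node_count_report.txt
}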

