It’s hardly travel until I consider how much time I’ve spent out and about in Arkansas since we moved here. Monday I had the day off and Mattie and I went out to Devil’s Den State Park. I’d read that the park could be crowded, but we hardly saw another soul. We trotted around the 1.7-mile main loop and poked our noses and camera into the limestone formations.
Author: jonathanb
-

Mandolin Cairns
I picked up playing music late in life. If I have any innate musical ability, it’s buried deep down waiting to be found. In my thirties I started playing electric and string bass. It was fun. I picked up some basic jazz and blues tricks playing with friends and through lessons, but some fundamental facts involving gravity and bass-clef rhythm instruments made practice less than fulfilling.
Walking by Tejon Street Music on my way to a bass lesson, I noticed a mandolin in the window. How shallow can you get, wanting to play an instrument because of its appearance? But I was tired of lugging amps or a 7/8 string bass, and I was drawn to the arcs comprising mandolin profiles. I quickly learned that the mandolin is tuned the same as a bass, except upside down, and that players contribute to both melody and rhythm parts. Practice could be fun!
Many hours of practice, lessons, and hacking later, I still haven’t found my talent, but I’ve had fun along the way. Here are some cairns I’ve left to trace my steps for my own reflection and hopefully to speed others’ journey to uncover their talents.
Models and teachers: It helps to hear different voices and different techniques from musicians at all levels. Masters like Chris Thile and Jethro Burns still wreck my brain when I try to figure out what’s going on. Having help breaking down what David Grisman and Bill Monroe play makes many other licks and songs approachable at higher speeds. Hearing mortals play at the farmers’ markets, along with teachers, lessons, and instructional videos, eases acoustic digestion.
Play with others: Metronomes make dull partners, but a patient group of peers will help you play to someone else’s time and expose your ear to the other voices of the same song. The Mandolin Orchestra of Northwest Arkansas is a wonderful group of colleagues and teachers. iReal Pro is another tool to help keep time during practice.
Projects: I’ll be posting some simple projects here that have served as launch points. Sweet Georgia Brown, Black Orpheus, and Hotel California are some landmarks I’d like to explore in future posts. Picking up a fake book is an easy way to explore new tunes and get started.
Experiment with strings and picks: Mandolins are expensive! String and pick choices give a lot more bang for the buck than laying out a mortgage payment for a new mandolin. After five years of experimenting with a clunky Breedlove KO (and waiting for Oregon production to end), I think I’ve finally dialed it in with Thomastik strings and a casein pick.
Learn to read: I wish I could hear better, but reading music gives me a jump start. Seeing lots of notes laid out in front of me used to be intimidating. Read Bach! The cello suites transcribed to the treble clef are a good start. Handel and Telemann wrote lots of music appropriate for mandolin (originally for flute and violin). Seeing the music also makes theory more approachable, which helps improvisation, memorization, and accompaniment. Reading isn’t a panacea, though, so make sure to trust your ears.
-
Exploring Python with Data
In the glut of Python data analysis tools, I’m sometimes embarrassed by my lack of comfort with Python for analysis. Static types, Java/Scaladoc, and slick IDEs working in concert with compilers provide guides that I haven’t been able to replace in Python. Dynamic typing also seems to exacerbate problems with library interoperability. With Anaconda and Jupyter, though, I can share some quick notes on getting started.
Here are some notes on surveying some admittedly canned data to classify malignant/benign tumors. The Web is littered with examples of using sklearn to classify iris species from feature dimensions, so I thought I would share some notes exploring one of the other datasets included with scikit-learn, the Breast Cancer Wisconsin (Diagnostic) Data Set. I’ve also decided to use Python 3 to take advantage of comprehensions and because that’s what the Python community uses where I work.
The notebook below illustrates how to load demo data (loading a CSV is simple, too), convert the scikit-learn matrix to a DataFrame if you want to use Pandas for analysis, and apply linear and logistic regression to classify tumors as malignant or benign.
In [7]:

%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
from sklearn import datasets

# Demo: scikit-learn NumPy matrix to Pandas DataFrame.
bc = datasets.load_breast_cancer()
pbc = pd.DataFrame(data=bc.data, columns=bc.feature_names)
pbc.describe()

In [8]:

from math import sqrt
from sklearn.linear_model import LinearRegression, LogisticRegression

# Plot training-set size versus classifier accuracy.

def make_test_train(train_count):
    # Train on the first train_count rows; always test on the second half.
    n = bc.target.size
    trainX = bc.data[0:train_count, :]
    trainY = bc.target[0:train_count]
    testX = bc.data[n//2:n, :]
    testY = bc.target[n//2:n]
    return trainX, trainY, testX, testY

def eval_lin(trainX, trainY, testX, testY):
    regr = LinearRegression()
    regr.fit(trainX, trainY)
    # Threshold the continuous prediction at 0.5 to get a class label.
    err = (regr.predict(testX) > 0.5).astype(int) - testY
    correct = [x == 0 for x in err]
    return sum(correct) / err.size, np.std(correct) / sqrt(err.size)

def eval_log(trainX, trainY, testX, testY):
    # The default max_iter may stop before converging on this data.
    regr = LogisticRegression(max_iter=10000)
    regr.fit(trainX, trainY)
    correct = (regr.predict(testX) - testY) == 0
    return sum(correct) / testY.size, np.std(correct) / sqrt(correct.size)

def lin_log_cmp(n):
    trainX, trainY, testX, testY = make_test_train(n)  # n of at least 20
    lin_acc, lin_stderr = eval_lin(trainX, trainY, testX, testY)
    log_acc, log_stderr = eval_log(trainX, trainY, testX, testY)
    return lin_acc, log_acc

xs = range(20, 280, 20)
lin_log_acc = [lin_log_cmp(x) for x in xs]
plt.figure()
lin_line, = plt.plot(xs, [y[0] for y in lin_log_acc], label='linear')
log_line, = plt.plot(xs, [y[1] for y in lin_log_acc], label='logistic')
plt.legend(handles=[lin_line, log_line])
plt.xlabel('training size from ' + str(bc.target.size))
plt.ylabel('accuracy');
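One caveat with the notebook above: it splits train and test sets by row position, so the accuracy depends on how the rows happen to be ordered. As a variant (my sketch, not part of the original notebook), the same linear-versus-logistic comparison can be done with scikit-learn's train_test_split, which shuffles before splitting:

```python
import numpy as np
from sklearn import datasets
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.model_selection import train_test_split

bc = datasets.load_breast_cancer()

# Shuffle, then hold out half the rows for testing; random_state fixes
# the shuffle so the run is repeatable.
trainX, testX, trainY, testY = train_test_split(
    bc.data, bc.target, test_size=0.5, random_state=0)

# Linear regression, thresholded at 0.5 to get a class label.
lin = LinearRegression().fit(trainX, trainY)
lin_acc = np.mean((lin.predict(testX) > 0.5) == testY)

# Logistic regression predicts labels directly.
log = LogisticRegression(max_iter=10000).fit(trainX, trainY)
log_acc = np.mean(log.predict(testX) == testY)

print(f"linear: {lin_acc:.3f}  logistic: {log_acc:.3f}")
```

A shuffled split (or full cross-validation) gives a fairer accuracy estimate when the source data may be sorted by class or by collection order.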

