Scala: Importing a Package and Aliasing Some of the Members
Mar 16th, 2014 by Brian Maso

I learned recently that it's easy to import a package in Scala while aliasing some of its members. In my case, I had a few different imported packages, each with a class/object named Hour. The following could be used to alias the Hour member while importing the whole package (a “wildcard” import):

import com.somewhere.packageA.{Hour => A_Hour, _}
import com.somewhere.packageB.{Hour => B_Hour, _}
import com.somewhere.packageC.{Hour => C_Hour, _}

As one would expect, the type A_Hour within the scope of the import refers to com.somewhere.packageA.Hour, B_Hour to com.somewhere.packageB.Hour, etc.
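
A quick sketch of how the aliases read in client code. The packages and the tiny Hour case class below are hypothetical stand-ins, using objects in place of real packages, so the snippet is self-contained:

```scala
// Hypothetical stand-ins for the three packages, each with its own Hour type
object packageA { case class Hour(value: Int) }
object packageB { case class Hour(value: Int) }
object packageC { case class Hour(value: Int) }

import packageA.{Hour => A_Hour}
import packageB.{Hour => B_Hour}
import packageC.{Hour => C_Hour}

// Each alias refers unambiguously to its own package's Hour
val total = A_Hour(1).value + B_Hour(2).value + C_Hour(3).value
println(total)  // prints 6
```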

Scala Refactoring: Quiet-my-scope with an Implicit Parameter vs. Pimp-my-lib with an Implicit Conversion Method
May 30th, 2011 by Brian Maso

The following two code snippets are equivalent: they differ only in what the client code ends up looking like, and yield similar results in all other ways:

// --------------------
// In LibraryCode.scala...
package library
package object code {
  // client code is expected to import the following method and provide
  // an implicit Thing value for t -- a "quiet-my-scope" patterned method
  def addedFeatureToSomeThing(implicit t: Thing) = ...
}
// --------------------
// In ClientCode.scala...
import library.code._
// An implicit Thing value
implicit val someThing: Thing = ...
// Use of the someThing value as an implicit parameter to the library method
val result = addedFeatureToSomeThing

The code above can be refactored to the pimp-my-lib style which follows, and vice versa:

// --------------------
// In LibraryCode.scala
package library
package object code {
  // client code is expected to import the following implicit conversion method,
  // which effectively adds a new method "addedFeature" to the Thing class in
  // the importing scope
  implicit def convertThingToAddFeature(t: Thing) = new {
    def addedFeature = ...
  }
}
// --------------------
// In ClientCode.scala...
import library.code._
val someThing: Thing = ...
// Use of the pimp-my-lib-patterned convertThingToAddFeature implicit conversion method
val result = someThing.addedFeature

In both cases, the imported LibraryCode.scala provides a callable method which requires state in the form of a Thing object. I’m assuming the Thing type is also defined external to the client code — probably in a third-party library.

The first pattern, which to the best of my knowledge doesn't have a name, gives you a method in the local scope that accepts an implicit parameter, effectively allowing you to pass a single in-scope object to multiple methods without repeating yourself. I'm going to dub this the “quiet-my-scope” pattern, because it reduces the visual chatter that would otherwise be caused by passing the same object to multiple method calls. And I'll just hope that name sticks.

The second is the famous pimp-my-lib pattern, which is well-known throughout Scalaland to be used to “add” methods to externally-defined classes.

Using implicits, we have these two API patterns to choose from. They are equivalent insofar as an implementation of either one can be mechanically transformed into the other; only style and convenience differentiate them. You might find it useful to keep this alternative to the pimp-my-lib pattern around, as in some APIs it will yield more readable, more convenient code.

Try it out: the next time you find yourself augmenting an existing class with methods using the pimp-my-lib pattern, take 5 minutes to refactor your solution to methods with implicit parameters (using the sample Thing code above as a guide). You won’t lose much time in the exercise, and you may find you’ll end up with better code.

Similarly, the next time you find yourself defining a method with an implicit parameter, take 5 minutes to work through what a refactored solution with a pimp-my-lib style method would look like. You may like the result quite a bit more than what you started with.
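
To make the two styles concrete side by side, here is a minimal, self-contained sketch. The Thing class and the feature bodies are hypothetical fillers rather than code from any real library, and the conversion uses the later implicit-class shorthand:

```scala
case class Thing(name: String)

object QuietMyScope {
  // quiet-my-scope: a plain method taking its Thing implicitly
  def addedFeatureToSomeThing(implicit t: Thing): String = s"feature of ${}"
}

object PimpMyLib {
  // pimp-my-lib: an implicit conversion adding addedFeature to Thing
  implicit class ThingWithFeature(t: Thing) {
    def addedFeature: String = s"feature of ${}"
  }
}

// Client code, quiet-my-scope style
{
  import QuietMyScope._
  implicit val someThing: Thing = Thing("widget")
  println(addedFeatureToSomeThing)  // prints "feature of widget"
}

// Client code, pimp-my-lib style
{
  import PimpMyLib._
  val someThing = Thing("widget")
  println(someThing.addedFeature)   // prints "feature of widget"
}
```

Either way the client supplies a Thing once and calls the added feature; only the call-site syntax differs.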


Statically Controlling Calls to Methods in Scala
May 3rd, 2011 by Brian Maso

I’ve recently learned how to use two complementary static techniques for controlling how many times methods are called in an API: Phantom Types and Scala type constraints.

Why would you want to control the number of times a method is called? Consider, for example, the common Builder Object pattern. I don’t mean the classic GoF Builder Pattern — I use builders all the time, and I don’t at all recognize this class diagram describing what a Builder is supposed to be. By Builder I mean an object with a fluid API whose job is to collect up state for the purpose of creating one or more other objects, which is pretty common in Java and many Scala DSLs.

Rafael Ferreira wrote a piece about builders and Phantom Types that I’m going to use as a starting point for introducing the topic. I’ll start with his introduction of a ScotchBuilder: a Scala object that knows how to take a proper gentleman’s order for a scotch.

Let me quote Rafael to introduce the domain:

So, let’s say you want to order a shot of scotch. You’ll need to ask for a few things: the brand of the whiskey, how it should be prepared (neat, on the rocks or with water) and if you want it doubled. Unless, of course, you are a pretentious snob, in that case you’ll probably also ask for a specific kind of glass, brand and temperature of the water and who knows what else. Limiting the snobbery to the kind of glass, here is one way to represent the order in scala.

sealed abstract class Preparation  /* This is one way of coding enum-like things in scala */
case object Neat extends Preparation
case object OnTheRocks extends Preparation
case object WithWater extends Preparation
sealed abstract class Glass
case object Short extends Glass
case object Tall extends Glass
case object Tulip extends Glass
case class OrderOfScotch(val brand:String, val mode:Preparation, val isDouble:Boolean, val glass:Option[Glass])

You can imagine providing a ScotchBuilder class to generate immutable OrderOfScotch objects with a fluid API. Below is a first pass at such a ScotchBuilder, which is your typical fluid implementation. It’s nice, but we can do better. (Code, again, taken originally from Rafael’s post, modulo changing from stateful to stateless and fixing a couple of his typos.)

case class ScotchBuilder(
    theBrand        :Option[String] = None,
    theMode         :Option[Preparation] = None,
    theDoubleStatus :Option[Boolean] = None,
    theGlass        :Option[Glass] = None) {
  def withBrand(b:String) = copy(theBrand = Some(b))
  def withMode(p:Preparation) = copy(theMode = Some(p))
  def isDouble(b:Boolean) = copy(theDoubleStatus = Some(b))
  def withGlass(g:Glass) = copy(theGlass = Some(g))
  def build() = new OrderOfScotch(theBrand.get, theMode.get, theDoubleStatus.get, theGlass)
}

There are two unattractive features of this Builder that we are going to clean up:

  1. a client can re-invoke the same setter methods over and over
  2. the client can also completely forget to call other methods that should be called

Check this out, where I am able to make a nonsense scotch order:

scala> val b = new ScotchBuilder
b: ScotchBuilder = ScotchBuilder@6f77e5d4
scala> b withBrand "Cragganmore" withMode Neat isDouble false withBrand "Macallan 25 year" isDouble true build
res3: OrderOfScotch = OrderOfScotch(Macallan 25 year,Neat,true,None)

Notice I was able to specify the brand twice, and the size as both “double” and “single”. Quite ambiguous what I meant there. Considering the price difference between a single Cragganmore (cheap) and a double Macallan 25 (expensive!), that’s maybe an ambiguity we’d like to stamp out of the system.

I’m going to now show you how to use both Phantom Types and type constraints to ensure at compile time that certain Builder methods are invoked:

  • with at-most-once semantics. E.g., withGlass should be called zero or one time by client code for a single ScotchBuilder instance.
  • with exactly-once semantics. E.g., withBrand, withMode, and isDouble each need to be called exactly once.
  • and, by the way, you can use this same technique to define one-or-more-times semantics. (Though I’m not going to go that far in this article. If you grok at-most-once and exactly-once, you will be able to figure out one-or-more-times.)

These techniques are not limited to just fluid Builder APIs. Pretty much any API where you want to constrain the call semantics can employ Phantom Types and type constraints to achieve the desired behavior. This applies especially to objects that traditionally walk through a lifecycle: for example, any API with init() and destroy() methods. The Builder under consideration also has a lifecycle: several configuration methods must be called, and then finally the build method gets invoked.

Continuing the Builder example, first I’m going to stop the build method from working if there are any builder values that haven’t been set correctly. That is, using Phantom Types and type constraints, I’m going to set things up so that the compiler won’t even compile code that attempts to build a scotch when all the builder parameters have not been set. After that I’m going to stop the individual withXYZ methods from being called more than once using the same technique.

Here’s how I’m going to make the build method uncallable except when the Builder has been fully configured: I’m going to add to the ScotchBuilder class one type parameter per method whose calls we want to track. The type parameters track whether each of the withXYZ() methods has been called or not; the types Zero and Once are defined to represent these two states. I’m then going to constrain the build method to be callable only when the appropriate type parameters are bound to the Once type.

So I’m adding 4 type parameters, each able to be bound to the type Zero or Once. So instead of having 1 ScotchBuilder class, I’m actually defining 16: one for each permutation of the possible bindings of the 4 type parameters. The build method will then be constrained to be callable only on ScotchBuilder[Once, Once, Once, _] (one of 2 specific bindings).

This first chunk of code adds the Zero and Once types to track the number of times the individual withXYZ() methods are called below. Note that the case class copy method allows you to specify type parameters, which I’m using in the implementation of each withXYZ method:

abstract sealed class Count
sealed trait Zero extends Count
sealed trait Once extends Count
object ScotchBuilder {
  def apply() = new ScotchBuilder[Zero, Zero, Zero, Zero]()
}
case class ScotchBuilder
    [WithBrandTracking <: Count,
     WithModeTracking <: Count,
     IsDoubleTracking <: Count,
     WithGlassTracking <: Count] (
    theBrand        :Option[String] = None,
    theMode         :Option[Preparation] = None,
    theDoubleStatus :Option[Boolean] = None,
    theGlass        :Option[Glass] = None) {
  def withBrand(b:String) = copy[Once, WithModeTracking, IsDoubleTracking, WithGlassTracking](theBrand = Some(b))
  def withMode(p:Preparation) = copy[WithBrandTracking, Once, IsDoubleTracking, WithGlassTracking](theMode = Some(p))
  def isDouble(b:Boolean) = copy[WithBrandTracking, WithModeTracking, Once, WithGlassTracking](theDoubleStatus = Some(b))
  def withGlass(g:Glass) = copy[WithBrandTracking, WithModeTracking, IsDoubleTracking, Once](theGlass = Some(g))
  // method definition with appropriate type constraints is just below...

And below is the build method, with type constraints guaranteeing the build method can only be called on instances of type ScotchBuilder[Once, Once, Once, _].

case class ScotchBuilder[...] {  // same type parameters and fields as above
  type IsOnce[T] = =:=[T, Once]
  def build[B <: WithBrandTracking : IsOnce, M <: WithModeTracking : IsOnce, D <: IsDoubleTracking : IsOnce] =
      new OrderOfScotch(theBrand.get, theMode.get, theDoubleStatus.get, theGlass)
}

I use the =:= type class to guarantee this constraint on the ScotchBuilder type parameters. An implicit value of =:=[A, B] only exists when A == B. (For a deeper explanation of the =:= type constraining object, check out this blog post by Debasish Ghosh). I’ve further created a type alias IsOnce[T] = =:=[T, Once], which allows me to apply =:= as a type class.
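
As a standalone illustration (a hypothetical sketch, separate from the builder), here is =:= used as plain implicit evidence; the method compiles at a call site only when the evidence exists:

```scala
// Compiles at a call site only if the compiler can prove A is exactly String.
// The evidence value ev also acts as a safe cast from A to String.
def mustBeString[A](a: A)(implicit ev: A =:= String): String = ev(a)

println(mustBeString("hello"))  // fine: evidence String =:= String exists
// mustBeString(42)             // compile error: no Int =:= String evidence
```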

The upshot of all of this is that any attempt to invoke build on a ScotchBuilder not matching ScotchBuilder[Once, Once, Once, _] simply cannot be compiled. You literally cannot compile code that improperly uses a ScotchBuilder to build an order of scotch!

Note that in the code above we never actually create an instance of Zero or Once — these type parameter bindings are purely compile-time bookkeeping. Hence the term Phantom Types: these types are never instantiated and never participate at runtime.

We can do even better with the builder above by constraining the withXYZ methods to have exactly-once or at-most-once call semantics, as appropriate. This makes the compiler fail at the point where the API is being misused — i.e., where a withXYZ method is called a second time on a single ScotchBuilder instance. So it’ll be a lot easier when using this API to figure out what you did wrong. Here is the final version of ScotchBuilder:

sealed abstract class Preparation  /* This is one way of coding enum-like things in scala */
case object Neat extends Preparation
case object OnTheRocks extends Preparation
case object WithWater extends Preparation
sealed abstract class Glass
case object Short extends Glass
case object Tall extends Glass
case object Tulip extends Glass
case class OrderOfScotch(val brand:String, val mode:Preparation, val isDouble:Boolean, val glass:Option[Glass])
abstract sealed class Count
sealed trait Zero extends Count
sealed trait Once extends Count
object ScotchBuilder {
  def apply() = new ScotchBuilder[Zero, Zero, Zero, Zero]()
}
case class ScotchBuilder
    [WithBrandTracking <: Count,
     WithModeTracking <: Count,
     IsDoubleTracking <: Count,
     WithGlassTracking <: Count] (
    theBrand        :Option[String] = None,
    theMode         :Option[Preparation] = None,
    theDoubleStatus :Option[Boolean] = None,
    theGlass        :Option[Glass] = Some(Short)) {
  type IsOnce[T] = =:=[T, Once]
  type IsZero[T] = =:=[T, Zero]
  def withBrand[B <: WithBrandTracking : IsZero](b:String) =
      copy[Once, WithModeTracking, IsDoubleTracking, WithGlassTracking](theBrand = Some(b))
  def withMode[M <: WithModeTracking : IsZero](p:Preparation) =
      copy[WithBrandTracking, Once, IsDoubleTracking, WithGlassTracking](theMode = Some(p))
  def isDouble[D <: IsDoubleTracking : IsZero](b:Boolean) =
      copy[WithBrandTracking, WithModeTracking, Once, WithGlassTracking](theDoubleStatus = Some(b))
  def withGlass[G <: WithGlassTracking : IsZero](g:Glass) =
      copy[WithBrandTracking, WithModeTracking, IsDoubleTracking, Once](theGlass = Some(g))
  def build[B <: WithBrandTracking : IsOnce, M <: WithModeTracking : IsOnce, D <: IsDoubleTracking : IsOnce] =
      new OrderOfScotch(theBrand.get, theMode.get, theDoubleStatus.get, theGlass)
}

All this extra type bookkeeping is well worth it for me, because it means I don’t have to write a bunch more test code. I never need to worry about or test for cases where a client is misusing my ScotchBuilder: it simply is not possible to do. And there’s never a need to write regression tests against a case which is impossible.
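
Putting the finished builder to work, here is a condensed, self-contained sketch of client usage (the same technique as above, with shortened type-parameter names); uncommenting either misuse line turns the bug into a compile error rather than a runtime failure:

```scala
sealed abstract class Preparation
case object Neat extends Preparation
case object OnTheRocks extends Preparation
case object WithWater extends Preparation
sealed abstract class Glass
case object Short extends Glass
case object Tall extends Glass
case object Tulip extends Glass
case class OrderOfScotch(brand: String, mode: Preparation, isDouble: Boolean, glass: Option[Glass])

sealed abstract class Count
sealed trait Zero extends Count   // phantom: never instantiated
sealed trait Once extends Count   // phantom: never instantiated

object ScotchBuilder {
  def apply() = new ScotchBuilder[Zero, Zero, Zero, Zero]()
}
case class ScotchBuilder[BT <: Count, MT <: Count, DT <: Count, GT <: Count](
    theBrand: Option[String] = None,
    theMode: Option[Preparation] = None,
    theDoubleStatus: Option[Boolean] = None,
    theGlass: Option[Glass] = None) {
  type IsOnce[T] = =:=[T, Once]
  type IsZero[T] = =:=[T, Zero]
  def withBrand[B <: BT : IsZero](b: String) = copy[Once, MT, DT, GT](theBrand = Some(b))
  def withMode[M <: MT : IsZero](p: Preparation) = copy[BT, Once, DT, GT](theMode = Some(p))
  def isDouble[D <: DT : IsZero](b: Boolean) = copy[BT, MT, Once, GT](theDoubleStatus = Some(b))
  def withGlass[G <: GT : IsZero](g: Glass) = copy[BT, MT, DT, Once](theGlass = Some(g))
  def build[B <: BT : IsOnce, M <: MT : IsOnce, D <: DT : IsOnce] =
    OrderOfScotch(theBrand.get, theMode.get, theDoubleStatus.get, theGlass)
}

// A legal order: each required setter called exactly once
val order = ScotchBuilder().withBrand("Cragganmore").withMode(Neat).isDouble(false).build
println(order)

// These would be compile errors, not runtime bugs:
// ScotchBuilder().withBrand("A").withBrand("B")   // brand set twice
// ScotchBuilder().withBrand("A").build            // mode and double never set
```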

Scala Word of the Day: View-Map-Filter*-Find
Apr 20th, 2011 by Brian Maso

A recent thread on the scala-users list discussed a nifty technique for efficiently handling a common sequence-processing use-case. The technique was coined view-map-find. Here I describe what I think is a slightly better tweak on the idea, which I’ll give the daunting name view-map-filter*-find. A good Scala programmer should have this in his toolbelt.

The Problem

You have a sequence, and you want to find the first (or any) element that matches some criteria. Of course your first thought should be find. However, it is often the case that the predicate passed to find would be pretty complex — perhaps too complex to read as meaningful code. In these cases it often yields much more readable, understandable code to break the predicate into two parts: the first portion is a mapping, and the second is a predicate test on the mapped value.

For example, you have a list of file names, and you want the first name that references a directory that contains at least one JPG image file. Using just find, you get a pretty hairy predicate function:

val fileNames: Seq[String] = ...
val dirWithJPGs = fileNames find { name =>
  val file = new
  if (file.isDirectory) {
    (file.list map { fn => new, fn) }
       map { f => f.isFile && f.getName.endsWith("jpg") }
    ).foldLeft(false)(_ || _)
  } else false
}

It’s pretty hard to see what’s going on there without staring at it for a bit. But the English description is pretty clear: each name in the fileNames list is being mapped to a boolean value indicating whether or not the File is a directory AND contains at least one file ending in “jpg”.

Using maps and filters, we can get something a lot easier to understand in code:

val fileNames: Seq[String] = ...
val dirWithJPGs = fileNames map { name => new } find { dir =>
  dir.isDirectory &&
    (for (file <- dir.list map { n => new, n) })
       yield file.getName.endsWith("jpg")
    ).foldLeft(false)(_ || _)
}

I find that a bit easier to read. An additional filter call is going to make it even easier to read:

val fileNames: Seq[String] = ...
val dirWithJPGs = fileNames map { name => new } filter { _.isDirectory } find { dir =>
  (for (file <- dir.list map { n => new, n) })
     yield file.getName.endsWith("jpg")
  ).foldLeft(false)(_ || _)
}

First the map call translates the initial String values into something suitable for testing ( instances). Then any complicated test is cracked into a series of one or more filters, with the final predicate applied with a find.

The final trick, which is really necessary to make this work, is to use a non-strict view of the original sequence, not the original sequence itself. If you map the original (strict) sequence, you end up mapping all elements before testing any of them. And if you’ve cracked your predicate into intermediate filters, then each filter would also need to be applied to all members. But if you map a view, then each element in the view is mapped, filtered, and finally tested individually before moving on to the next.

So, say you’re looking for the first item in a 100,000-element sequence that satisfies some predicate function — you’re going to want to map and test the items individually. Making a view of a sequence is as easy as calling the sequence’s view method.

val fileNames: Seq[String] = ...
val dirWithJPGs = fileNames.view map { name => new } filter { _.isDirectory } find { dir =>
  (for (file <- dir.list map { n => new, n) })
     yield file.getName.endsWith("jpg")
  ).foldLeft(false)(_ || _)
}

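
To see the laziness concretely without touching the file system, this hypothetical sketch counts how many elements actually get mapped when a map-filter-find pipeline runs over the strict sequence versus a view:

```scala
var mappedCount = 0
def trace(n: Int): Int = { mappedCount += 1; n * 10 }  // counts every map call

val xs = (1 to 100).toSeq

// Strict: map transforms all 100 elements before find tests any of them
mappedCount = 0
xs map trace filter { _ % 20 == 0 } find { _ > 25 }
val strictCount = mappedCount      // 100

// View: each element flows through map/filter/find before the next is touched
mappedCount = 0
xs.view map trace filter { _ % 20 == 0 } find { _ > 25 }
val lazyCount = mappedCount        // stops soon after the first match

println(s"strict mapped $strictCount elements, view mapped $lazyCount")
```
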
“A co-Relational Model of Data for Large Shared Data Banks”
Mar 25th, 2011 by Brian Maso

Great ACM Queue article comparing SQL and NoSQL systems. Interesting conclusions:

  • While the SQL and NoSQL models are very different, there are layers for querying that make them functionally more-or-less equivalent. That is, the key/value model of NoSQL systems is a functional dual of Codd’s relational model.
  • Introduction of the term coSQL to refer to the key/value-based functional dual of SQL, described as the model backing “the most common noSQL databases” (Hadoop, Cassandra, Riak, and Neo4j, among others mentioned explicitly as common NoSQL systems).
  • Putting forth a “data-model formalization model” coupled with the idea of a monadically based query language as central themes that will develop within the coSQL world, and that will drive the industry to a more economically efficient configuration.

Who knows if the predictions will come true, but the coSQL model is a great tool for comparing SQL and NoSQL technologies on the axes of theoretical functionality and capability. I should have mentioned the authors first: Erik Meijer and Gavin Bierman. These guys are math-oriented to the extreme. We can trust their development of coSQL as a dual of SQL, and their conclusions about the theoretical comparisons of coSQL and SQL systems. Practical comparisons are a completely different matter, of course.

The Dunning-Kruger Effect
Mar 24th, 2011 by Brian Maso

This one is making the rounds at the offices of a client of mine. Great material for snark. From Wikipedia:

The Dunning–Kruger effect is a cognitive bias in which unskilled people make poor decisions and reach erroneous conclusions, but their incompetence denies them the metacognitive ability to appreciate their mistakes.[1] The unskilled therefore suffer from illusory superiority, rating their ability as above average, much higher than it actually is, while the highly skilled underrate their own abilities, suffering from illusory inferiority. This leads to the situation in which less competent people rate their own ability higher than more competent people. It also explains why actual competence may weaken self-confidence. Competent individuals falsely assume that others have an equivalent understanding. As Kruger and Dunning (1999) conclude, “Thus, the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others”

I’m ruminating on how I’ve been on both sides of that effect before. Being considered competent in one sphere of life (programming), I’ve made both mistakes described above:

  1. Over-estimating my ability to reason and come to good conclusions about subjects I’m really not very studied in. I’ve been very guilty of spouting off about politics, economics, and all sorts of things I’m not qualified to have an opinion on.
  2. Under-estimating what I have to offer. See how many posts I have in this blog? Not nearly as many as I should. I am prone to self-editing, assuming that if I find something cool or valuable in programming, it would already be really obvious to others, so much so as to make my post seem dumb.

Now that I have a name for the effect that’s in play, I’m going to try to do less of the former and, by posting more, less of the latter!

Happy Answer To the Ultimate Question of Life, the Universe, and Everything Day!
Oct 10th, 2010 by Brian Maso

101010 base 2 == 42 base 10

The Aquarian Age is come, evidently. Hallelujah, Hari Hari, etc. But most of all, “Don’t Panic!”

Don't Panic

#UBP10 Hashtag Tracking
Apr 13th, 2010 by Brian Maso

Quietly, Blumenfeld & Maso has been building up a software portfolio for Twitter analytics over the past few months. Last night, we got more than our feet wet with live tracking and reporting for the #UBP10 party from 5 Minutes for Mom (thanks to @momfluential for setting that up).

Some basic stats to give you a taste of how successful the #UBP10 Twitter party was: 15,704 tweets from 1,204 attendees, with a combined reach of 1,159,469 followers, and 29,397,282 unique messages in 7 days. Fully half that traffic came during the 2-hour #UBP10 party last night!

Over the next couple weeks we’ll be rolling out more and more of our tracking and visualization tools. (Please contact us for more info if you’re interested in hearing more!)

Twitter hashtag tracking statistics and analytics

Non-Strict + Zip = Fab Fib!
Apr 7th, 2010 by Brian Maso

Just reviewed a gem from Programming Scala. Of all the Fibonacci implementations I’ve seen, my new favorite is below. It’s one statement long, and there’s not a recursive function in sight:

lazy val fib: Stream[Int] =
  0 #:: 1 #:: ( => p._1 + p._2))

If you have not seen that zip trick before, follow me on a little explanation. The code defines a Stream — a non-strict iterable — that begins with two literal values, 0 and 1 (the #:: operator creates a Stream with the left value as the head and the right value as the tail). The stream then continues the Fibonacci sequence by zipping the sequence to itself. More specifically, to its own tail.
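
Forcing a few elements makes the behavior concrete (repeating the one-statement definition so the snippet stands alone):

```scala
lazy val fib: Stream[Int] =
  0 #:: 1 #:: ( => p._1 + p._2))

// Taking the first eight elements forces evaluation of the zipped pairs
println(fib.take(8).toList)  // prints List(0, 1, 1, 2, 3, 5, 8, 13)
```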

This figure illustrates what the zipper is creating.

Zipping a stream to itself to generate Fibonacci sequence

The tail value of the initial sequence is just the sequence starting with the literal value 1. The zipper creates Pairs out of each member of the sequence pairwise joined with the next member of the sequence.

The first pair is (0, 1).

The second pair is then (1, 0 + 1) = (1, 1).

The third pair is then (1, 1 + 1) = (1, 2).

The fourth pair is then (2, 1 + 2) = (2, 3). And so on.

The coolest part is of course the complete lack of apparent recursion. The whole sequence is lazily evaluated, so the Stream takes up little space initially — though Streams in Scala are memoized, so once the N-th element is evaluated, its value is stored rather than computed again in the future.


We can generalize this stream-zipping technique. When the value of a Stream element n can be calculated from the previous k sequence members, we can use a k-ary version of the technique. That is, suppose member n of the stream theStream can be defined by some function s:

def theStream(n) = s(theStream(n-1), theStream(n-2), …, theStream(n-k))

We can define the stream in a single statement thus:

  • Explicitly define the first k stream members
  • For all other members, perform k-1 zips to create a TupleK of the previous k sequence elements.
  • A single closure then defines the next element from this Tuple.

Here, for example, the avg4_rolling function below builds a Stream of the rolling average of the last 4 members of a Stream[Double] using this technique:

def padded_data(data: Stream[Double]) = Stream.fill(4)(0.0) ++ data ++ Stream.fill(4)(0.0)
  // Note: tail padding not a problem even if data is infinite.
/* Here's where the stream is joined to itself. Also,
   mapping the (((A,A),A),A) to (A,A,A,A)
   for readability. Can't be recursively defined   */
def zip4[A](str: Stream[A]): Stream[(A,A,A,A)] =
 (str zip str.tail zip str.tail.tail zip str.tail.tail.tail) map { p =>
     (p._1._1._1, p._1._1._2, p._1._2, p._2)}
/* Could be recursively defined in terms of
   base type Product */
def avg4(p: (Double,Double,Double,Double)): Double =
 (p._1 + p._2 + p._3 + p._4) / 4
/* Finally, generating the rolling-average stream */
def avg4_rolling(data: Stream[Double]): Stream[Double] =
 zip4(padded_data(data)) map (avg4)

You can use Iterator.sliding(n) to get the same effect, and that does work on infinite, non-strict streams. Personally, I just thought this technique was so cool, and it does have the benefit of strongly-typed tuples. (Iterator.sliding() simply provides more sequences. Try it out if you’re curious.)

Logic, Fallacy, and Dobie Gillis
Mar 26th, 2010 by Brian Maso

Super-segue I’d like to take you on for no reason, other than to help you understand what it’s like to be me sometimes…

Just read “Programming and fallacies” on Michael Galpin’s blog.

Made me think of the very (very, very) old “Love is a Fallacy” humor piece by Dobie Gillis (book) author Max Shulman (please read that some time, it’s such a riot). And when I say “old” I mean the original story was old when the old black-and-white Dobie Gillis TV show was on. In case you aren’t familiar, that was the show that launched the career of Bob Denver, who would go on to star as Gilligan on “Gilligan’s Island”.

But my segue-addled mind doesn’t stop there, because Dobie Gillis is no doubt the best example of modern humor a logical mind can ever read. And who defined modern humor? No shit: Freud. You probably didn’t know that early in his career he was famous not for his mommy-mangled psycho-sexual theories; his PhD thesis was in fact a seminal text on laughter and humor.

That I know all those bizarrely ancient pop-cultural references, and you don’t, means that

  1. I am cooler than you;
  2. I need more focused entertainment on Friday nights; and
  3. you don’t need to worry about competing with me because my brain is constantly routed down these fruitless tracks.