Scala tutorial for programmers



Scala tutorial for programmers by Robert Zaremba is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.
Based on a work at http://rz.scale-it.pl/docs/scala.html.

Version: 1.0, 2012

If you want to redistribute this work, please share that information with me so I can link to it as well.



Intro

The mixture of OOP and FP (functional programming) and its concurrency features make Scala an excellent bread-earning language of the future. Since Scala provides concurrency in terms of actors (yes, inherited from Erlang) and has web development frameworks like Lift, Scalatra, and Play, it can be seen as a replacement for Java on the JVM in the coming years. Like Groovy, you can use all existing Java libraries; however, the learning curve is a little steep because of the complexity of the language. But from my own observation, and from the blogosphere, this language can fit extremely well into the Java world.

According to one blog post, the developer of Groovy even said that if he had known Scala existed and was on its way, he would never have developed Groovy. This statement alone says a lot about Scala.

Scala is a powerful language, and that power can cause problems for people who don't know Scala well enough. That is where this tutorial comes in: to make Scala friendlier for programmers and to present it in a compact way.


Syntax

Scala Syntax Primer post by Jim McBeath.

Blocks

We can build a block in two ways:

  ( expression 1;
    expression 2;
    expression 3;
  )

  // Second way:
  { expression 1    // don't need to use a ';'
    expression 2
    expression 3
  }
		
For a method call with exactly one argument, you can use curly braces to surround the argument instead of parentheses.

The purpose of this ability to substitute curly braces for parentheses for passing in one argument is to enable client programmers to write function literals between curly braces. This can make a method call feel more like a control abstraction.
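A small sketch of this (with a hypothetical `twice` method taking one function argument):

```scala
// 'twice' is a hypothetical one-argument method used for illustration
def twice(f: Int => Int): Int = f(f(3))

val a = twice(x => x + 1)   // ordinary call with parentheses
val b = twice { x =>        // same call; curly braces make it read like a control structure
  x + 1
}
// a == b == 5
```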

Identifiers

Scala has two namespaces for identifiers: the value scope and the type scope. So it is possible to define two identifiers with the same name (one as a class, the other as a val) and use them in the same block. However, the compiler expects that the identifier in the value scope is the companion object of the corresponding identifier in the type scope; if it is not, the compiler emits a warning:

  class X
  val X = 1
  new X              // returns new object
  X                  // returns 1
			
Identifier kind: a higher-kinded type with two type parameters, such as Pair[String,Int], can be written in infix notation

  var x : String Pair Int = ("aa", 1)      // var x: Pair[String, Int] = ...
  type X = M1 + M2                         // for some type M1, M2, +[T, T2]
  type +[A,B] = Pair[A,B]
  Quantity[M + M2]                         // goes to Quantity[+[M, M2]]
  head :: tail                             // apply method in class ::[T](head: T, tail: List[T])
		  

Operators

Scala has great support for infix / prefix / postfix operators.
There is a great post about them, which I won't repeat here. More about operators in the subsection methods - operators.

Methods

When an operator's name ends with ":", the operator binds to the right: it is a method of its right operand, and the left side is passed as the argument.
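For example, List's `::` (cons) method ends with ":", so it binds to its right operand:

```scala
val list = List(2, 3)
val res1 = 1 :: list     // '::' ends with ':' so it is a method of the RIGHT operand...
val res2 = list.::(1)    // ...i.e. 1 :: list expands to list.::(1)
// both give List(1, 2, 3)
```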

Method arguments have the val modifier by default.

The return keyword is not required: the value of the last expression is returned.

The return type of a method which doesn't return a value is Unit. To make a method return Unit:

  • set the return type to Unit, or
  • omit the '=' sign after the method head, e.g.: def f(){...}

Types

Base types

Char

16-bit unicode

Long

Literal ends with 'l' or 'L'. Example and properties:

  0xcafebabe < 0
  0xcafebabeL > 0
			

Double

Default type for floating-point variables

Float

Literal ends with 'f' or 'F'.

raw Strings

Strings that preserve all characters between triple quotes """, e.g.: """Hi "Robert", what's up?"""
stripMargin strips leading whitespace up to and including the margin character '|' (the default) from each line, so a raw string can be indented along with the surrounding code.
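A short example of a raw string combined with stripMargin ('|' marks the margin on continuation lines):

```scala
// quotes inside the raw string are preserved; '|' marks the left margin
val s = """Hi "Robert",
          |what's up?""".stripMargin
// s == "Hi \"Robert\",\nwhat's up?"
```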

Constants

final val x=5

scala.Symbol

Allows defining symbol literals - interned, identifier-like values.
syntax: '<alpha_num>*
eg: 'variable, Symbol("symbol_name"). A reserved word such as yield can't be used in the literal form, but Symbol("yield") works; to use a keyword as an ordinary identifier, wrap it in backticks: `yield`.

type

keyword type defines an alias to some other type:

    type Action = () => Unit
			
Action is an alias to the function type ()=>Unit

Null, Unit etc..

Scala has a null value, but it's strongly recommended not to use it. We have several null-like values:
Null
- a trait; the type of the null reference, a subtype of every reference type
null
- the only instance of the Null trait (traits are abstract!)
Nothing
- a trait. It is a subtype of every type, but a superclass of nothing, and it has no instances; a throw expression, for example, has type Nothing.
None
- a concrete subclass of Option[_], used to represent a missing value without risking a null pointer exception. Option has exactly 2 subclasses - Some and None. None indicates a missing value for an object. Option is a monad-like type, so it supports safely taking elements.
Unit
- the return type of methods which don't return any meaningful value

                        scala.Any
                       /         \
                 AnyVal           AnyRef (== java.lang.Object)
              /  |   |   \       /    |    \
        Double Int Boolean     String Seq  ...
            ...       Unit             |
                                      List
              \   \   |   |            |
               \   \  |   |          Null
                \   \ |   |         /
                 \   \|   |        /
                      Nothing

AnyRef corresponds to Object in Java.
Scala is functional, so every function should return something; when there is nothing meaningful to return, it returns Unit. This explains the difference between null and (): the function still returns something (Unit), which has nothing to do with the null reference.
Furthermore, every object, if it exists, should have a value. This shows the usability of None (None is an object with a concrete value).
Using Option has a lot of advantages over null
  • A None value unambiguously means the optional value is missing. A null value may mean a missing value or it may mean the variable was not initialized.
  • The fact that you must go through explicit methods (such as isEmpty, isDefined, get, getOrElse, map, filter) to reach the actual value makes you think about the possibility that it might not be there and how to handle that situation, so you are less likely to write code that mistakenly assumes there is a value when there is not.
  • If your code assumes that there is a value and it is executed when there is not a value, then a NoSuchElementException exception is raised when Option is used. It is more specific than a NullPointerException and so should be easier to interpret, track down, and fix.
Example of Option usage with chaining Monad methods:

  def getSystemProperty(s: String): Option[String] = ...
  def loadPropertyFile(s: String): Option[Map[String, String]] = ...
  val props = getSystemProperty("PROPFILE") flatMap loadPropertyFile         // very nice, no boilerplate null checking
  val timeout = props flatMap (_.get("TIMEOUT")) map (_.toInt) getOrElse 60  // using monad method chaining
				
Legacy Java code
Java methods return null frequently. Java doesn't have an Option type with such elegant monad methods, because Java does not have function literals.
One approach to managing Java methods is to write wrapper functions in Scala that call the Java functions and translate between null values and None values.
More about why null is bad: http://stackoverflow.com/questions/1274792/is-returning-null-bad-design.
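A minimal sketch of such a wrapper, using the Option factory (Option(null) yields None) around the null-returning java.lang.System.getProperty:

```scala
// Option(...) maps null to None, so Java's null never escapes the wrapper
def getProp(name: String): Option[String] =
  Option(System.getProperty(name))

val home    = getProp("java.home")        // Some(...): always set by the JVM
val missing = getProp("surely.not.set")   // None: property not defined
```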

Rich Types

Rich types are wrappers around the normal types with useful operators and methods.
There are implicit conversions from the standard types (from Java) to the rich types (Scala) through implicit functions.
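For example, max and to are not defined on the primitive Int; they are supplied by the rich wrapper (RichInt) via an implicit conversion:

```scala
val m = 1.max(5)          // Int is implicitly converted to RichInt, which has max
val r = (1 to 3).toList   // 'to' also comes from the rich wrapper
```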

Conditional expressions

if

   if (e1) e2 [else e3]

	// when the else part is omitted, it is equivalent to
   if (e1) e2 else ()
			
while

   while (e1) e2
			
do-while

   do e1 [';'] while (e2)
			
for loop
For loop is discussed here

Break, continue

There is no language support for break and continue. However, there is library support for break in scala.util.control.Breaks.
More in Programming in Scala e2, page 177.
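A short sketch of the library-based break:

```scala
import scala.util.control.Breaks._

var sum = 0
breakable {                  // break() exits the nearest enclosing breakable block
  for (i <- 1 to 100) {
    if (i > 3) break()
    sum += i
  }
}
// sum == 1 + 2 + 3 == 6
```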

There is a proposal for some syntax changes to control statements, which would be clearer with fewer parentheses.

Classes

Methods of a class and its companion object have access to each other's private fields.

Operators

Thanks to quite comfortable method access syntax (no '.' needed) and the set of allowed characters, operators are just class methods.
Good explanation about constructing operators, using them, and precedence can be found in Scala ref 6.12.3.

unary prefix operators

syntax: unary_<char>, where char is in {+,-,!,~}.

	class X(a: Int){
	  def unary_* = a*2      // legal method name, but not usable as a prefix operator
	  def unary_- = a-1
	}
	var x = new X(2)
	-x         // OK, calls x.unary_-
	*x         // Error, * is not available as a unary prefix operator
	x.unary_*  // OK, ordinary method call

unary postfix operators

Nothing special: methods without arguments and parentheses

Equality

The method == is predefined for every object as an alias to equals. If we want to redefine ==, we should do it by overriding the equals method.
== is not type-aware, since it is inherited from the Any class, whose equals signature is equals(a: Any). So == takes an argument of any type, and there is no compilation error when comparing unrelated classes, e.g. Fish("dolphin") == Office("desk"). The result will always be false. The only help from the compiler is a warning when you use such a comparison.

To escape from this behavior you can use === method from Scalaz library, which is discussed in Tools and libraries section.

The exception to the == behavior is case classes, for which MyCase(pa1, pa2, ...) == MyCase(pb1, pb2, ...) corresponds to pa1 == pb1 && pa2 == pb2 && ...
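A quick illustration of the difference between case-class equality and the default reference-based equals (Plain is a hypothetical non-case class for contrast):

```scala
case class Point(x: Int, y: Int)
class Plain(val x: Int)                            // hypothetical ordinary class

val structural  = Point(1, 2) == Point(1, 2)       // true: compared field by field
val referential = new Plain(1) == new Plain(1)     // false: default equals compares references
```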

reference equality

The methods below compare references (memory addresses); they cannot be overridden.
  • eq
  • ne

Constructors

Scala has only one constructor - the primary constructor. But we can add auxiliary constructors (methods named this) and factory methods (apply on the companion object).

When we think about implementing a constructor versus an apply method on the companion object, we need to remember that the companion object's apply method doesn't take part in inheritance and polymorphism.

Primary constructor

The primary constructor is the body of "free" code in the class definition.
Each variable declared in the primary constructor becomes an object field.
Arguments of the primary constructor are the parameters of the class - the variables in the parentheses after the class name. The default modifier for them is private[this] val, which will be discussed in the protection scope paragraph. That means that by default we can't change the value of primary constructor arguments. We can change this by writing out their whole declaration (e.g.: class X(var x: Int)).
If we don't specify a modifier on primary constructor arguments, and we use them only in the primary constructor body (not in class methods), then the compiler won't add them as object fields. Otherwise they become object fields.

	class A (arg: Int){
	  def f(a: A) = {
	    println(arg)           // OK, arg will be stored as an object member
	    println(a.arg)         // Error, arg is private[this]
	  }
	}
	

Auxiliary constructor

It is a method whose name is this.
An auxiliary constructor must start with a call to another constructor of the same class; the chain must eventually reach the primary constructor.

Private constructor

We add the private keyword in front of the constructor definition.
For the primary constructor, we add the private modifier in front of the class parameter list: class X private(...)...
A private constructor is accessible only from other object methods (other constructors, methods, or companion object methods).

Example


  class A (arg: Int, private[this] var xt: Int){
    //body of primary constructor is here:
    xt = some_function()         // xt is temporary value, doesn't occupy any memory
    var x1 = 1                   // mutable variable
    private var x2 = 2
    private[this] val x3 = 3     // private constant - used only by this object, not accessible from outside!
    val (x4, x5) = compute(arg)  // in this case, compiler will create hidden object field for tuple.
    val x6 = {                   // initialization using block code. All variables from block code are temporary
      val t = 3
      arg*t - xt*t
    }

    def this(arg: Int) = this(arg, 0)  // auxiliary constructor, calls primary constructor
  }


  object A {
    def apply(arg: Int) = new A (arg, 1)
  }
	  

Forcing type in factory methods

Consider the example:

  abstract class Expectation[T] extends BooleanStatement {
    val expected: Seq[T]
    …
  }

  object Expectation {
    def apply[T](expd:     T ): Expectation[T] = new Expectation[T] {val expected = List(expd)}
    def apply[T](expd: Seq[T]): Expectation[T] = new Expectation[T] {val expected =      expd }

  }
		
Here we explicitly define each apply to return Expectation[T], else it would return a structural subtype Expectation[T]{val expected: List[T]}.

Class initialization

We simply instantiate a class using the new keyword: new ClassName (constructor args...). Furthermore, we can use helper methods from the companion object which perform the initialization for us.

Initializing inner classes

There is special case about inner classes which is discussed in path-dependent types - Instantiating inner classes.

In general you can't instantiate an inner class without specifying an outer class instance.

Properties - getter and setter methods

Scala has a built-in mechanism to control variable reads and writes.
For every field declared as var or val Scala implicitly creates getter and setter functions (unless you define them explicitly) and changes the meaning of the field. It works as follows:

  class X {
    var celsius = 1
    val v2 = 0

    def fahrenheit = celsius * 9 / 5 + 32  // getter without an associated field
    def fahrenheit_= (f: Int) {            // setter without an associated field
      celsius = (f - 32) * 5 / 9
    }
  }
  // compiles roughly to
  class X {
    private[this] var v1 = 1               // hidden field behind 'celsius'
    private[this] val v2impl = 0           // hidden field behind 'v2'

    def celsius: Int = v1                  // default getter
    def celsius_= (x: Int) { v1 = x }      // default setter

    def v2: Int = v2impl                   // default getter (no setter: it's a val)

    def fahrenheit = ....                  // the same as original
  }
	  
The fields v1 and v2impl are the implicitly created backing fields; celsius/celsius_= and v2 are the generated getter and setter methods. v2 doesn't get a setter method since it is a val.
There is also a special syntax = _, the default initializer, which initializes a var to the default value of its type (numbers - 0, booleans - false, strings and other reference types - null); it requires an explicit type annotation, e.g. var s: String = _.

As we see in the example, we can define a setter and getter without creating a field.

Casting

We can check the runtime type of an object with isInstanceOf and cast it to another type with asInstanceOf.
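A minimal sketch, with a pattern match shown as the more idiomatic check-and-cast:

```scala
val x: Any = "hello"

val isStr = x.isInstanceOf[String]   // runtime type test
val s = x.asInstanceOf[String]       // cast (throws ClassCastException if the type is wrong)

// pattern matching checks and casts in one step
val len = x match {
  case str: String => str.length
  case _           => 0
}
```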

Special methods

Inheritance

Scala follows the same model of inheritance as Java: we can have abstract classes, and every class can have only one base class.

To inherit we use the extends keyword, which goes right after the primary constructor declaration.

The primary constructor must call some base-class constructor by applying arguments to the base class name (or leave them out when the base class has a no-argument constructor).
Inheritance with repeated arguments
There is a trick when the base constructor takes repeated arguments: a special form is used to pass them along.
Consider the following examples.

Examples:

  class A1(x: Int) {
    val param = init(x)                       // Base calls methods to initiate x
    def init(x: Int) = x*2
  }

  class A2(x: Int) extends A1(A2.f(x)) {      // companion members need the A2. prefix here
    override def init(x: Int) = x*4
  }
  object A2 {
    def f(x:Int) = x+3
  }

  val b = new A2(1)                           // A1's primary constructor calls the overridden init
  println(b.param)                            // prints 16


  class A(name: String, vals: String*)

  class B(name: String, vals: String*) extends A(name, vals: _*)   // look how we apply vals! 
		

Self reference

Every class/trait scope has a predefined this value, which is a self-reference to the current object of the class.
Using this we can access object fields that may be hidden by the current scope.

From Scala Ref. §6.5:
The expression this can appear in the statement part of a template or compound type. It stands for the object being defined by the innermost template or compound type enclosing the reference. If this is a compound type, the type of this is that compound type. If it is a template of a class or object definition with simple name C, the type of this is the same as the type of C.this.

Self reference aliases

We can create aliases for this at the class-level scope.
This is useful when we are inside an inner class definition and want to access some field/method of the outer class. The following example demonstrates how to do it:

  class O {
    selfO =>
    val name = "O"
    trait I {
      selfI =>
      val name = "I"
      def test() {
        this.name           // refers to "I"
        selfI.name          // refers to "I"
        O.this.name         // refers to "O"
        selfO.name          // refers to "O"
      }
    }
  }
			

Self type annotations

We can make an assertion that a class must also be of some other type.
This creates an inheritance dependency which must be satisfied when:
  • instantiating an object
  • creating subclasses

For example we want to create class Foo which must be also the type of Bar1 and Bar2:


  trait Bar1
  trait Bar2
  class Foo {
    self : Bar1 with Bar2 =>             // 'self' can be renamed to any identifier (but not a keyword)
      ...
  }

  val x = new Foo                        // error: class Foo cannot be instantiated because it does not conform to its self-type Foo with Bar1 and Bar2
  val x = new Foo with Bar1 with Bar2    // OK
			
More info about self type annotations on scala pages

Curiously recurring template pattern

Here I present an interesting constraint on a class using self-type annotations:

  // we want to create base parametric type which requires that in subtypes the type parameter will be the subtype itself:
  // So the constraint is that the subtypes must be the form of:
  //      S extends Base[S]

  abstract class Base[Sub] {
    self:Sub =>
    def add(s:Sub) : Sub
  }

  case class Vec2(x:Double,y:Double) extends Base[Vec2] {            // Ok
    def add(that:Vec2) = Vec2(this.x+that.x, this.y+that.y)
  }

  // attempts to cheat with inheritance won't work:
  case class Foo() extends Base[Vec2]                          // error: illegal inheritance;
				
Covariance example
Sometimes we need covariant types.
In the previous example, if class Base had some field, say base_field, and if the add method referred to that field, we would get an error that class Sub doesn't have such a field. Class Base has it, but inside the definition of Base we can refer only to members of Sub.
The example below explains how to resolve this.

Consider a class Child with a method roomWith(aChild), which asserts that self and aChild are roommates. If there are two subclasses Boy and Girl, you will want to subclass this method in both with signatures roomWith(aBoy) and roomWith(aGirl) respectively, while doing the actual work (which is the same in both cases, presumably) in the Child method. The type system should check the correctness of every call.


  abstract class Child[C <: Child[C]] {
    self : C =>
    var roomie : Option[C] = None

    def roomWith(aChild : C)= {
      roomie = Some(aChild)
      aChild.roomie = Some(this)
    }
  }
  class Boy extends Child[Boy]
  class Girl extends Child[Girl]

  val b1 = new Boy
  val b2 = new Boy
  val c1 = new Girl

  b1.roomWith(b2)
  b1.roomWith(c1)                      // error: type mismatch
				

See the self-type annotations section for more information.

FOR, Generators, Monads

Methods on numbers: to, until, eg: 1 to 5

For expression

Generator

  for (seq) yield expr
			
For loop

  for (seq) expr
			
seq is a sequence of generators, definitions, and filters, with semicolons between successive elements.
A generator, definition, or filter is a kind of pattern match applied one by one to the elements from the iterator. If the match fails for an element (or the definition fails, etc.), that element is simply discarded from the iteration instead of throwing an exception.

  for((a,b) <-range             // generator
               if  x > 10       // filter
               if ...;          // needs ; before nested expression
       y <- range if ...        // nested generator + filter expression
               CaseCl(tmp1, tmp2) = y.some_function;  // definition
               if (predicate tmp1)                    // other filter
  ) [yield] {
    block code                  // returns something
  }
			
We can use {} instead of () to avoid putting ';' after each statement.

Caution: the definition part is computed every time a new value is taken.
If a definition doesn't depend on variables bound by some generator, it is better to put it outside the for expression:


  for (x <- 1 to 1000; y = expensiveComputationNotInvolvingX)   // BAD to put y here
    yield x * y

  // Better solution
  val y = expensiveComputationNotInvolvingX
  for (x <- 1 to 1000) yield x * y
			

For and higher-order functions

Every for ... yield ... expression is translated to some composition of the map, flatMap and withFilter functions.
Every for loop is translated to some composition of the withFilter and foreach functions.

map, flatMap and withFilter expect a function as their first argument, which can be a partial function.
This is how for expressions are converted to higher-order functions.


  for ((x1 , ..., xn ) <- expr1 ) yield expr2
  // is translated to:
  expr1 .map { case (x1 , ..., xn ) => expr2 }

  for (pat <- expr1 ) yield expr2
  // pat is a general pattern, thus the translation is a bit more complicated:
  expr1 withFilter {
    case pat => true
    case _ => false      // this guarantees that the match never throws a MatchError
  } map {
    case pat => expr2
  }
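A concrete illustration (with hypothetical data) that the two forms compute the same result:

```scala
val pairs = List((1, "a"), (2, "b"), (3, "c"))

// for expression with a tuple pattern and a filter
val viaFor = for ((n, s) <- pairs if n > 1) yield s * n

// the hand-desugared equivalent
val desugared = pairs withFilter { case (n, s) => n > 1 } map { case (n, s) => s * n }
// both: List("bb", "ccc")
```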
				

FOR generalization

As we've seen before, for expressions are translated to compositions of map, flatMap, withFilter and foreach. Thus it is possible to use for on every type which implements these functions!
But it's also possible to define a subset of these methods, and thereby support a subset of all possible for expressions or loops. Here are the precise rules: Scala defines no typing rules for the for expressions themselves, and does not require the methods map, flatMap, withFilter, or foreach to have any particular type signatures.
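As a sketch, a hypothetical one-element container Box becomes usable in for expressions just by providing these methods:

```scala
// minimal container: map + flatMap are enough for a two-generator for-yield
class Box[A](val value: A) {
  def map[B](f: A => B): Box[B] = new Box(f(value))
  def flatMap[B](f: A => Box[B]): Box[B] = f(value)
  def foreach(f: A => Unit): Unit = f(value)
}

val result = for {
  a <- new Box(2)
  b <- new Box(3)
} yield a * b
// desugars to: new Box(2).flatMap(a => new Box(3).map(b => a * b))
```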

Monads

Monad is a type that implements the map, flatMap and withFilter methods.
From a functional point of view, monads can explain a large number of types with computations, ranging from collections, to computations with state and I/O, backtracking computations, and transactions, to name but a few.

Furthermore, you can characterize every monad by map, flatMap, and withFilter, plus a "unit" constructor that produces a monad from an element value. In an object-oriented language, this "unit" constructor is simply an instance constructor or a factory method.


All this suggests that the concept of for expression is more general than just iteration over a collection, and indeed it is. For instance, for expressions also play an important role in asynchronous I/O, or as an alternative notation for optional values. Watch out in the Scala libraries for occurrences of map, flatMap, withFilter — when they are present, for expressions suggest themselves as a concise way of manipulating elements of the type.

Exceptions

Exceptions are raised with the throw keyword. A throw expression has result type Nothing, which conforms to every other type.
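For example, because throw has type Nothing, it can appear in one branch while the other branch is an Int (half is a hypothetical function):

```scala
// the else branch has type Nothing, which conforms to Int, so the if is an Int
def half(n: Int): Int =
  if (n % 2 == 0) n / 2
  else throw new IllegalArgumentException("odd number: " + n)
```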

checked exceptions

Scala does not require you to catch checked exceptions, or declare them in a throws clause. You can declare a throws clause if you wish with the @throws annotation, but it is not required.

try catch expression

try-catch-finally returns a value. The value is returned only from try (if no exception occurs) or from catch (if an exception is thrown and caught). The value computed in the finally clause is dropped; finally should not normally change the value computed in the main body or in a catch clause. If a finally clause includes an explicit return statement, or throws an exception, that return value or exception will "overrule" any previous one that originated in the try block or one of its catch clauses. See more in Programming in Scala e2 (page 172).

	val file = new FileReader("input.txt")
	try {
	  ...
	} catch {
	  case ex: FileNotFoundException => // handle missing file
	  case ex: IOException => ...
	} finally {
	  file.close()        // 'file' is declared outside try so that it is visible here
	}
	

Functions and Methods

Scala has methods (which are the Java-derived kind, part of some class) and functions, which are first-class values.

Methods vs function values:
Methods are converted to a function only if the target type specifies it, or else if followed by '_'. This is arguably not very elegant, so why did we do it?
In fact the first version of Scala did not distinguish methods from functions, and could do partial application without the '_'. Unfortunately, this did not work very well in practice. Programmers often forget function arguments and Scala's subtyping discipline is too lenient to catch many of those errors. E.g. println("abc".length) might print <function> if you forgot the parentheses after length.

Quite good article about this can be found here

CAUTION: Converting from a method to a function loses parameter defaults.

FunctionX class

In Scala every function is instance of class FunctionX, where X is a number of function arguments.
So (_:Int) + (_:Double) is an instance of Function2[Int, Double, Double], and () => 1 is a constant function, an instance of Function0[Int].

Scala has syntactic sugar for function types:
() => R stands for Function0[R]
(A, B) => R stands for Function2[A, B, R]
and so on ...

_ on functions

"someMethod _" wraps someMethod to function value, whose apply method is exactly someMethod.
"_" is used to make partial functions - to create new function value which curry missing arguments.
Example:

	val nums = List(1,2,3)
	nums.foreach(println _)          // OK: a function value is required, println is converted to one
	nums.foreach(println)            // OK: the compiler expects a function

	def succ1(i: Int) = i+1
	val succ2 = { i: Int => i+1 }
	var p = succ1                    // ERROR: succ1 is not a first-class value
	var p = succ1 _                  // OK: succ1 is converted to a function value by _
	var p: Int=>Int = succ1          // OK: the compiler expects a function
	   // succ2 == p == succ1 _     ; all equal by type and semantics

	def method_g[T](x:T) = x.toString.substring(0,4)
	val fun_g = [T](x:T) => x.toString.substring(0,4)  // ERROR: function values can't be generic

	    //**  but we can work around the restriction from method_g:  **
	class Cfun_g[T] extends Function1[T,String] {
	  def apply(x:T) = x.toString.substring(0,4)
	}
	val fun_g = new Cfun_g[String]
	fun_g("this is a string")
	

local functions and placeholders

A placeholder can be used only once per argument: multiple underscores mean multiple parameters, not repeated use of a single parameter.

	someList.reduceLeft(_ + _)   // a two-argument function: each '_' is a separate parameter

Assertions, arguments validation
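Scala's standard helpers here are require (argument validation, throwing IllegalArgumentException) and assert (internal invariants, throwing AssertionError); a minimal sketch with a hypothetical safeSqrt function:

```scala
def safeSqrt(x: Double): Double = {
  require(x >= 0, "x must be non-negative")   // rejects bad caller input
  val r = math.sqrt(x)
  assert(math.abs(r * r - x) < 1e-9)          // checks our own invariant
  r
}
```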

Partially applied functions

applied to none of the arguments:

var f = someFunction _

applied to some of the arguments, e.g. a function with 3 arguments of types String, Int, Int:

var f = someFunction("hej", _: Int, 3)

Implementation of val and var

val c=1 is implemented as a pair:

	private final int c;
	public int c();                    // getter
	  
var v=1 is implemented as a triple:

	private int v;
	public int  v();                   // getter
	public void v_$eq(int);            // setter
	  
val f = {x:Int => x+1} is implemented as a pair:

	private final scala.Function1 f;   // field keeping the function value as a constant (f is a val)
	public scala.Function1 f();        // getter method for the function
	  
var f = {x:Int => x+1} is implemented as a triple:

	private scala.Function1 f;         // field keeping the function value as a variable (f is a var)
	public scala.Function1 f();        // getter method for the function
	public void f_$eq(Function1);      // setter
	  

Declaring function arguments

repeated arguments


  def echo(args: String*) = for(a <- args) println(a)
  def pass_to_echo(args: String*) = echo(args: _*)

  var arr = Array("jeden", "dwa")
  echo("jeden", "dwa")
  echo(arr)                       // Error
  echo(arr: _*)                   // OK
		

Named arguments


    def f(arg1: Int, arg2: String) = ...
    f(arg2="hej", arg1=1)
			  
It is also possible to mix positional and named arguments. In that case, the positional arguments come first.

Default arguments


    def f(arg: Int = 2)
			  

by name

Scala arguments can be passed by name: the expression passed as an argument is not computed eagerly before the function call, but every time the argument is used inside the function.

  import scala.collection.mutable.ListBuffer

  def f(a: => Unit) = { a; a }                        // a is an argument passed by name
  f(println("hej"))                                   // outputs two lines of "hej"

  var actions: ListBuffer[() => Unit] = ListBuffer()  // list with some operations to do
  def insert(condition: Boolean)(operation: => Unit) = {
     if (condition) actions.append(() => operation)
  }
			
In the previous example, insert puts an unevaluated expression of type Unit into the actions list, wrapped as a function which evaluates the expression each time it is called.

  var count = 0
  val e = { count += 1 }              // Caution: the block is evaluated eagerly here; count == 1, e == ()
  insert(true){
     println("incrementing")
     count += 1
  }                                   // nothing is printed, count is not incremented
  insert(true)(println("hej"))

  actions(0)                          // returns the function which evaluates {println("incrementing"); count+=1}
  actions(0)()                        // count == 2
  actions(0)()                        // count == 3
  actions(1)()                        // outputs "hej" to the console
			
Short-circuit evaluation of &&
The && method of the Boolean class has the same semantics as in other languages: the second argument is evaluated only when the first one is true. This is achieved with an argument passed by name; the implementation looks similar to:

  // sketch, not the real implementation
  class Boolean ...
    def && (right: => Boolean): Boolean =
      if (this) right
      else false
				

Method overloading

Scala allows method overloading.
Overloaded methods are methods with the same name but different argument types.
Method overloading is the mechanism of choosing the right method based on the types of the arguments at the call site; the compiler chooses the method by examining the invocation argument list.

Limitations of method overloading

Due to JVM type erasure, Scala has the same limitations for method overloading as Java.
To the runtime system, both foo methods below have the same type:

  def foo(x: List[Int])
  def foo(x: List[Boolean])
				
So in client code the runtime can't figure out which one to use. We can get past this limitation using union types, described in the tips section.
Quite an interesting solution is to use functions with different numbers of arguments (so that the functions have different signatures after type erasure), described in http://stackoverflow.com/questions/4982552/scala-method-overloading-over-generic-types.

Another possibility would be to use a family of case classes + pattern matching + implicit conversions from the desired generic types to the case classes. More: http://jim-mcbeath.blogspot.com/2008/10/polymorphism-using-implicit-conversions.html

Scala code guidelines advise avoiding method overloading.

Traits

Using Traits

We use traits by writing the keyword with right after the class name and inheritance part.
It can be used both in a class definition and at class instantiation.
If the trait has a superclass, then the trait can only be mixed into a class that also extends this superclass.

Trait can has a super call on a method declared abstract. Such calls are illegal for normal classes.
Since super calls in a trait are dynamically bound, the super call for a abstract method in a trait will work so long as the trait is mixed in after another trait or class that gives a concrete definition to the method. This arrangement is frequently needed with traits that implement stackable modifications. To tell the compiler you are doing this on purpose, you must mark such methods as abstract override.

The order of mixins is significant.
The method call order is determined by linearization. Roughly speaking, when you call a method on a class with mixins, the method in the trait furthest to the right is called first. If that method calls super, it invokes the method in the next trait to its left (or from a parent trait - not a class!), and so on.
The linearization of a class is computed from back (of the declaration order) to front as follows: the last part of the linearization is the superclass. Traits can extend other traits; in that case the overriding version of a method is called first (e.g. trait T2 extends T1; class X extends ... with T2 with T1; val x = new X; if all of them have a method f and T2 overrides f, then T2.f is called first in the expression x.f).

Example:

  abstract class IntQueue {
     def get(): Int
     def put(x: Int)
  }

  class BasicIntQueue extends IntQueue {
     import scala.collection.mutable.ArrayBuffer
     private val buf = new ArrayBuffer[Int]
     def get() = buf.remove(0)
     def put(x: Int) { buf += x }
  }

  trait Doubling extends IntQueue {
     abstract override def put(x: Int) { super.put(2 * x) }
  }
  trait Incrementing extends IntQueue {
     abstract override def put(x: Int) { super.put(x + 1) }
  }


  class MyQueue extends BasicIntQueue with Doubling

  val queue = new MyQueue with Incrementing
  val queue = new BasicIntQueue with Doubling with Incrementing


  // example with parameter constructor
  class WithParameter(arg: Int) extends MyQueue with  Doubling
	  
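Under these rules, the mixin order in the example above can be checked concretely (the definitions are repeated here, with explicit result types, so the snippet runs on its own):

```scala
import scala.collection.mutable.ArrayBuffer

abstract class IntQueue {
  def get(): Int
  def put(x: Int): Unit
}
class BasicIntQueue extends IntQueue {
  private val buf = new ArrayBuffer[Int]
  def get() = buf.remove(0)
  def put(x: Int): Unit = { buf += x }
}
trait Doubling extends IntQueue {
  abstract override def put(x: Int): Unit = { super.put(2 * x) }
}
trait Incrementing extends IntQueue {
  abstract override def put(x: Int): Unit = { super.put(x + 1) }
}

val queue = new BasicIntQueue with Doubling with Incrementing
queue.put(10)   // rightmost trait first: Incrementing (10+1), then Doubling (2*11)
// queue.get() returns 22
```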
Interesting traits

Packages

Packages: compilation and runtime

The binary file structure depends on packages, not on the source file structure. Every definition is compiled to exactly one binary .class file, based on the package in which the definition is placed.

All the classes, traits and companion objects from package Pack1.Pack1_1 in some module (file) are compiled to the Pack1/Pack1_1 subdirectory of the output directory.

In order to run a class Main which is in package Pack1.Pack1_1 you need to run: scala Pack1.Pack1_1.Main
You need to make sure that the directory containing Pack1 is on your CLASSPATH, either by being in that directory or by adding it to the CLASSPATH through the -cp option or the CLASSPATH environment variable.

Common exceptions during running a main class
While trying to run a main class you can encounter the following exceptions:
  • Exception in thread "main" java.lang.RuntimeException: Cannot figure out how to run target: Main
    The JVM can't find class Main.
    Probably the class Main is in some package, or the class file is not in your CLASSPATH. Check in the source file whether class Main is in some package, or check your classpath. If you run the Main class from the directory where the Main.class file is, then it is probably a problem with the package (but you should check the classpath as well, by running scala -cp . test.Main)
  • Exception in thread "main" java.lang.RuntimeException: Cannot figure out how to run target: test.Main
    The JVM can't find class test.Main.
    You ran test.Main, so you explicitly called the Main class from the test package. The Main.class file is stored somewhere in the directory test (this is a JVM packaging rule).
    Probably the parent of the test directory (the directory where class Main resides) is not in your classpath. Try to run the class one more time, specifying the classpath.
    E.g.: you are in the test directory itself; run:
    scala -cp .. test.Main
  • Exception in thread "main" java.lang.NoClassDefFoundError: Main (wrong name: test/Main)
    The class Main doesn't have a main method.
    The class you are trying to run on the JVM needs to have a main method.
    The main method in Scala must reside in an object and have the following definition:
    def main(args: Array[String]) { ... }
    The exception was raised because the class (object) doesn't have a main method, or you tried to call the wrong class and forgot to specify the package.
    Check in the source file that there is an object with a main method; if the module is contained in package X.Y, you need to specify it at runtime: scala X.Y.Main, making sure that X's parent directory is in your CLASSPATH.

Accessing names in packages

If code resides in the same package, Scala allows the use of short, unqualified names.
Otherwise you must use fully qualified names (like java.collection...).
When using the curly-braces packaging syntax, all names accessible in scopes outside the packaging are also available inside it.

If there are two packages launch, one in the global scope and a second inside package bob, then a method inside bob that accesses package launch refers to bob.launch.

If you want to access global launch you need to write:

	_root_.launch
		
Put another way, every top-level package you can write is treated as a member of package _root_.

Imports

An import clause makes members of a package or object available by their names alone without needing to prefix them by the package or object name.

  import bobsdelights.Fruit // easy access to Fruit
  import bobsdelights._     // easy access to all members of bobsdelights
	  

What is special about import in Scala

Scala’s import clauses are quite a bit more flexible than Java’s.
  • may appear anywhere
  • may refer to objects (singleton or regular) in addition to packages
  • let you rename and hide some of the imported members
  • they can import packages themselves, not just their non-package members.
    For example, if the package java.util.regex is imported, this makes regex usable as a simple name. To access the Pattern singleton object from the java.util.regex package, you can just say regex.Pattern.
Examples:

  // import a simple name x. This includes x in the set of imported names
  import x

  def showFruit(fruit: Fruit) {
    import fruit._
    println(name +"s are "+ color)  // the same as fruit.name, fruit.color
  }

  // this imports two objects from Fruits and renames Apple
  import Fruits.{Apple => McIntosh, Orange}

  // imports all names from Fruits and renames Apple
  import Fruits.{Apple => McIntosh, _}

  // imports all members of Fruits except Pear
  import Fruits.{Pear => _, _}
		

Implicit imports

Scala adds some imports implicitly to every program. They are:
  • java.lang._
  • scala._
  • Predef._
The Predef object contains many definitions of types, methods, and implicit conversions that are commonly used in Scala programs.

Access modifiers

Members of packages, classes, or objects can be labeled with the access modifiers private and protected.

Private members

A member labeled private is visible only inside the class or object (and its companion) that contains the member definition. In Scala this rule also applies to inner classes (in Java it does not). Class-private or object-private members may not be abstract, and may not have protected or override modifiers.
Caution
Private members can't be overridden. This prevents changes in class behaviour injected by descendants. The concept here is very similar to the C++ one: each ancestor holds its own copy of the members unless a special mechanism (virtual inheritance, for example) is invoked.
So if we mix in two traits with the same private member (but a different value), the final class will have two different fields. It won't be overridden.

Protected members

A protected member is only accessible from subclasses of the class in which the member is defined. In Java such accesses are also possible from other classes in the same package.

Public members

Every member not labelled private or protected is public.

Scope of protection

Access modifiers in Scala can be augmented with qualifiers. A modifier of the form private[X] or protected[X] means that access is private or protected “up to” X, where X designates some enclosing package, class or singleton object.
They enable you to express Java’s accessibility notions such as package private, package protected, or private up to outermost class which are not directly expressible with simple modifiers in Scala.

A class labeled private[bobsrockets] is visible in all classes and objects that are contained in package bobsrockets, but all code outside package bobsrockets can't access this class.

A member labeled private[this] allows access only from the same object: any access must be made from the very same instance.
An interesting case is a field of type private[this] val: nobody can modify it, and only the object itself has access to its value. There is a proposal for an optimization: such a field wouldn't take any memory space in the object, and in the places where it is used the value would be compiled in (as a temporary value).
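A small sketch of object-private access (the Counter class is invented for illustration):

```scala
class Counter {
  private[this] var count = 0      // object-private: only this very instance can touch it
  def increment(): Unit = { count += 1 }
  def value: Int = count
  // A method like `def copyFrom(other: Counter) = { count = other.count }` would NOT
  // compile: other.count is object-private, inaccessible even from the same class.
}

val c = new Counter
c.increment()
// c.value == 1
```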

Modifier protected[X] in a class C allows access to the labeled definition in all subclasses of C and also within the enclosing package, class, or object X.

Visibility and companion objects

A class shares all its access rights with its companion object and vice versa.

One exception concerns protected static members. A protected static member of a Java class C can be accessed in all subclasses of C. By contrast, a protected member in a companion object makes no sense, as singleton objects don't have any subclasses.

Class hiding

Private class
If the class is private we can't even use it as a type.
Private constructors and private members are one way to hide the initialization and representation of a class. Another way is to hide the class itself and export only a trait that reveals the public interface. We use the trait to get access to the type while forbidding new on it.

 trait Queue[T] {
   def head: T                              // clients interface
   ...
 }
 object Queue {
   def apply[T](xs: T*): Queue[T] =         // clients factory method hiding actual constructor complexity
     new QueueImpl[T](xs.toList, Nil)
   private class QueueImpl[T](              // True class inaccessible from outside
       private val p1: List[T],             // private parameters, inaccessible even from Queue companion object
       private val p2: List[T]
   ) extends Queue[T] {
     ...                                    // Queue implementation
   }
 }		
So when we create a QueueImpl object through the factory method of the Queue companion object, we only get access to the Queue type. The QueueImpl object is visible outside as an object of type Queue, and clients have access only to the members of trait Queue.

This technique is used only when we want to hide the whole class.

Package objects

Any kind of definition that you can put inside a class (trait, class, function, variable), you can also put at the top level of a package.
To do so, put the definitions in a package object. Each package is allowed to have one package object. Any definitions placed in a package object are considered members of the package itself.

We make package object by writing: package object package_name {... }
The contents of the curly braces can include any definitions you like.

Package objects are frequently used to hold package-wide type aliases and implicit conversions. The top-level scala package has a package object, and its definitions are available to all Scala code.

  //  file gardening/fruits/Fruit.scala
  package gardening.fruits
  case class Fruit(name: String, color: String)
  object apple extends Fruit("Apple", "green")

  //  in file gardening/fruits/package.scala
  package gardening
  package object fruits {
    val planted = List(apple, apple)
    def showFruit(fruit: Fruit) {
      println(fruit.name +"s are "+ fruit.color)
    }
    implicit def fruit2string(f: Fruit):String = f.name + " " + f.color
  } 

Package objects are compiled to files named package.class which are located in the directory of the package that they augment. So the package object fruits would be compiled to a class with the fully qualified name gardening.fruits.package. (Note that, even though package is a reserved word in Java and Scala, it is still allowed as part of a class name on the JVM level.)

Assertions and Unit testing

Assertions

Assertions can be turned on/off (so that the AssertionErrors are thrown or not by assert / ensuring) using the JVM command line flags -ea, -da

Predef.assert(assertion: Boolean, [message: => Any]) :Unit
If assertion fails, an AssertionError is thrown with message (any object) as the explanation. assert will call toString on it to get a string explanation.
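For example (the average helper is invented for illustration):

```scala
// assert documents and checks an internal precondition;
// when the condition is false it throws AssertionError with the given message
def average(xs: List[Int]): Int = {
  assert(xs.nonEmpty, "average of an empty list")
  xs.sum / xs.length
}
// average(List(2, 4)) == 3
```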

ensuring is a method of class Ensuring.
There exists an implicit conversion from Any to Ensuring, so we can call this method on any expression. The constructor takes the object to operate on when the ensuring method is called.
There are two method declarations. When the condition fails, ensuring throws an AssertionError with the optional msg as an argument. Example:

  var x = 2
  if(x<2)
    x
  else {
    val y: Int = 2
    y+x
  } ensuring ( _ >= 4, "x is >= 2 so y+x is >=4")
	  
The block after else returns (x+y): Int, which is converted to Ensuring with itself as the parameter. The value is then applied to the cond function (x+y >= 4), and if the result is false an exception is thrown.

Unit testing

Just use one of the Java tools, or newer tools made for Scala, such as ScalaTest, ScalaCheck...

Case Classes and pattern matching

Case classes are a special kind of class with support for pattern matching.
Case classes follow several conventions.

match

selector match { (pattern => expression)* }.
Scala uses a simple lexical rule for variable disambiguation: a simple name starting with a lowercase letter is taken to be a pattern variable; all other references are taken to be constants. To see the difference, create a lowercase alias pi for math.PI.
To use a lowercase name as a pattern constant, enclose the pattern in backticks (`pi`). Backticks are also used to treat a keyword as an ordinary identifier (e.g. `for`).

So a variable whose value should be used in the match evaluation (rather than being bound) must be uppercase, or a lowercase name written in one of the forms below:


  var x = 2
  (1+1) match  {
    case `x`       => true    // refers to the existing variable x
    case x         => true    // always TRUE! x is taken as a free variable and bound to the match expression (1+1)
    case X         => ...     // only accepts a value equal to the value of X (the upper case here makes a difference)
    case z if z==x => true    // z is taken as a free variable and bound to the match expression (1+1); the match succeeds when the guard evaluates to true
  }
Lowercase names in infix operator patterns (e.g. x::tail) are also taken as pattern variables. The expression part of a case may be empty; then the returned value is (): Unit.
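The example below matches over an expression-tree hierarchy that the tutorial does not define; a minimal sketch of the assumed case classes could be:

```scala
// assumed definitions for the Expr examples (Var, Number, UnOp, BinOp)
sealed abstract class Expr
case class Var(name: String) extends Expr
case class Number(num: Double) extends Expr
case class UnOp(operator: String, arg: Expr) extends Expr
case class BinOp(operator: String, left: Expr, right: Expr) extends Expr
```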
Example:

  def simplify(expr: Expr): Expr = expr match {
    case UnOp("-", UnOp("-", e))  => e                  // Double negation
    case BinOp("+", e, Number(0)) => e                  // Adding zero
    case BinOp("*", e, Number(1)) => e                  // Multiplying by one
    case List(0, _*)             => println("found it") // variable long sequence beginning with 0
    case UnOp("abs", e @ UnOp("abs", _)) => e           // when the match succeeds, e is bound to UnOp("abs", _)
    case BinOp("+", x, x) => BinOp("*", x, Number(2))   // this fails: patterns must be linear,
                                                        // a pattern variable may appear only once in a pattern
    case BinOp("+", x, y) if x == y => ...              // pattern guard; reformulation of the above
    case s: String if s(0) == 'a'    => ...             // pattern guard
    case BinOp(op, l, r)  => BinOp(op, simplifyAll(l), simplifyAll(r))     // recursive match further

    // other matchers:
    case x: String              => x.length       // any String
    case x: Map[_, _]           => x.size         // any Map; we can't be more precise, e.g. Map[_, Int], because of type erasure
    case (x, y, ..., z)         => ...            // only accept a tuple of the same arity
    case Extr()                 => ...            // only accept if Extr.unapply(expr) returns Some(Seq()) - some of something/empty sequence
    case Extr(x)                => ...            // only accept if Extr.unapply(expr) returns Some(Seq(x)) or Some(Tuple1(x))
    case Extr(x, y, ..,z)       => ...            // only accept if Extr.unapply(expr) returns Some(Seq(x,y,...,z)) or Some(TupleN(x,y,...z)) - the same arity
    case x Extr y               => ..             // only accept if Extr.unapply(expr) returns Some(Seq(x,y)) or Some((x,y))
    case x | y | ... | z        => ...            // accepts if any of the patterns is accepted (patterns may not contain assignable identifiers)

    case _ => expr
  }	  

Type parameters in Match

From Scala specification:

A parametrized type pattern T [a(1), . . . , a(n)], where the a(i) are type variable patterns or wildcards _. This type pattern matches all values which match T for some arbitrary instantiation of the type variables and wildcards. The bounds or alias type of these type variable are determined as described in (§8.3).
...
A type variable pattern is a simple identifier which starts with a lower case letter. However, the predefined primitive type aliases unit, boolean, byte, short, char, int, long, float, and double are not classified as type variable patterns.

So if a type parameter is lowercase it is taken as a free variable.

  case x: Seq[a]  => ...      // this will match any Seq, and its type parameter will be bound to a
        
The conclusion is that we can't specify type parameters with fully qualified names (like java.lang.Integer) in patterns.

If we need to specify a type from some package, we need to make a type alias starting with an uppercase letter:
type JavaInt = java.lang.Integer

Sealed classes

In a pattern match, it's good to make sure you have covered all of the possible cases. In general it is impossible for the compiler to tell you what the possible cases are, because new case classes can be defined at any time and in arbitrary compilation units. A sealed class cannot have any new subclasses added except the ones in the same file.
We make a class sealed by putting the sealed keyword at the very front of the class definition. This is very useful for pattern matching, because it means you only need to worry about the subclasses you already know about, and you get better compiler support as well.

unchecked annotation

The @unchecked annotation has a special meaning for pattern matching. If a match's selector expression carries this annotation, exhaustiveness checking for the patterns that follow is suppressed, so the compiler will not warn about non-exhaustive matches.

	def describe(e: Expr): String = (e: @unchecked) match {
	  case Number(_) => "a number"
	  case Var(_) => "a variable"
	  // case BinOp(...             // known from the context that this case can never occur
	}  
Match is an expression in Scala, i.e., it always results in a value.
Alternative expressions never “fall through” into the next case.
If none of the patterns match, an exception named MatchError is thrown.

Depattering

There are three places where pattern matching might happen: val, case and for. case was described above.

The patterns in for expressions can be used to extract values from an object providing map/flatMap/filter/withFilter/foreach functions.


  val exp = new BinOp("*", Number(5), Number(1))
  val BinOp(op, left, right) = exp               // extract from val, throws exception if not succeeded

  // filters for pattern, but pattern cannot be "identifier: Type", though that can be replaced by "id1 @ (id2: Type)"
  for (pattern <- object providing map/flatMap/filter/withFilter/foreach) ...
	  

Case sequences as partial functions

A sequence of cases can be used everywhere a function literal can be used. Essentially, a case sequence is a more general function literal. Instead of having a single entry point and list of parameters, a case sequence has multiple entry points, each with their own list of parameters. Each case is an entry point to the function, and the parameters are specified with the pattern. The body of each entry point is the right-hand side of the case.

	val withDefault: Option[Int] => Int = {
	  case Some(x) => x
	  case None => 0
	}	  

Partial functions

A sequence of cases gives a partial function of type PartialFunction[A, R]. If you apply such a function on a value it does not support, it will generate a run-time exception (scala.MatchError).

For example, a partial function that returns the second element of a list of integers:


	val second: List[Int] => Int = {
	  case x :: y :: _ => y
	}	
To get rid of compiler warnings you need to declare that you know you are working with partial functions, by setting the proper type.
(A1, A2, ..., An) => A is a function from A1 × A2 × ... × An to A.
PartialFunction[A1, A] is a partial function from A1 to A.

	val second: PartialFunction[List[Int],Int] = {
	  case x :: y :: _ => y
	}		
The PartialFunction object contains a couple of interesting functions which take partial functions as arguments, to be used for matching in a functional style:

  import PartialFunction._

  cond("abc") { case "def" => true }                       // result: false
  condOpt("abc") { case x if x.length == 3 => x + x }      // result: Option[java.lang.String] = Some("abcabc")
  condOpt("abc") { case x if x.length == 4 => x + x }      // result: Option[java.lang.String] = None
        
Checking if a function is defined at a particular value
Partial functions have a method isDefinedAt to check this.

	second.isDefinedAt(List(5,6,7))  // returns true
	second.isDefinedAt(List())       // returns false  
Another interesting method is lift, which turns a PartialFunction[T, R] into a Function[T, Option[R]], meaning non-matching values result in None instead of throwing an exception.
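A short sketch of lift in action, reusing the second function from above:

```scala
val second: PartialFunction[List[Int], Int] = {
  case _ :: y :: _ => y
}
val secondOpt = second.lift   // List[Int] => Option[Int]
// secondOpt(List(5, 6, 7)) == Some(6)
// secondOpt(List())        == None
```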

Extractors

In Scala, patterns can be defined independently of case classes.
An extractor is an object with an unapply method, which is called to see whether the case can match the input.

Extractors work with match, assignment and for comprehension expressions.

Scala defines two types of extractor methods: unapply and unapplySeq. In the match clause case C(...), if C has an unapply method, it is called with the matched object. The return type of unapply depends on the case clause and should be chosen accordingly (Boolean for no arguments, Option[T] for one, Option[(T1, ..., Tn)] for several). unapplySeq is used instead of unapply to match a variable number of parameters, with the first ones specified. If the extractor returns None, the match does not succeed.

More about extractors on daily-scala


  case class Food(food:String)
  case class Name(name:String)

  object Eats {
    def unapply(desc:String):Option[(Name,Food)] = {
      val i=desc.indexOf(" eats ")
      if (i> -1)
        Some((Name(desc.substring(0,i)), Food(desc.substring(i+6))))
      else None
    }
  }

  val x= "Brutus eats meat" match { case Eats(f,n) => (f,n) }   // x=(Name(Brutus),Food(meat))
  val Eats(f,n) = "Brutus eats meat"                            // f=Name(Brutus), n=Food(meat)
		
For comprehensions with extractors

  val eats_l = List("A eats B", "B ate C", "C not D", "E eats F")
  for (Eats(f, n) <- eats_l) yield f                   // returns List(Name("A"), Name("E")) -
                                                       //    the other strings don't match Eats(f, n)
		

Extractors versus case classes

TODO

Understanding type inference algorithm

Let's take an example code:

  def msort[T](comp: (T, T) => Boolean)(xs: List[T]): List[T] =
      ...  // the body of the sort method

  // List has build in method sortWith
  // using the method
  1. val l = List(4,2,3)
  2. l sortWith (_ > _)                        // OK 
  3. msort(_ > _)(l)                           // ERROR 
  4. msort[Int](_ > _)(l)                      // OK 
  5. msort((a1: Int, a2: Int)=> a1 > a2)(l)    // OK 
	
The problem is that the compiler needs to know the method's type parameters to instantiate it. In line 2 the compiler knows how to instantiate the method sortWith because it is a member of the object and doesn't have any other generic type (besides the object's one).
In line 4 the compiler knows the concrete type of msort, because we explicitly set the type parameter.
In line 5 the compiler also knows the type parameter: it can look at the arguments to infer it. However, in line 3 the compiler can't look at the parameter l, because msort is a curried function and is instantiated step by step. There are two calls, and the first call needs to be typed separately.
If we rewrite msort so that its parameter lists are swapped, the code runs without error.
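A sketch of the swapped version (the name msortSwapped is ours, and the body is simplified to delegate to sortWith):

```scala
def msortSwapped[T](xs: List[T])(comp: (T, T) => Boolean): List[T] =
  xs.sortWith(comp)   // placeholder body; a real msort would merge-sort

// now T is inferred as Int from the first argument list,
// so the function literal needs no type annotations:
msortSwapped(List(4, 2, 3))(_ > _)   // List(4, 3, 2)
```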

This inference scheme suggests the following library design principle: When designing a polymorphic method that takes some non-function arguments and a function argument, place the function argument last in a curried parameter list by its own. That way, the method's correct instance type can be inferred from the non-function arguments.

Example:

  val xss : List[A] = ...
  (xss :\ List[B]()) ( op )     // fold right operation
	
The type of op is (A, B) => B.
Here we must explicitly set the concrete element type of the seed List, because the type of op is not related only to xss, and the compiler needs to properly instantiate the method List[A].:\[B](z: B)(op: (A, B) => B): B.
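A concrete instance (foldRight is the method behind the :\ symbol):

```scala
val xss = List("a", "b", "c")
// the element type of the seed list must be given explicitly:
val r = xss.foldRight(List[String]())((x, acc) => x :: acc)
// r == List("a", "b", "c")
```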
Note about a limitation of type inference
Interoperability with Java requires Scala's type system to be compatible with Java's. In particular, this means that Scala needs to support subtyping and (name-) overloaded definitions. This makes type inference difficult.

Generics - type parametrization

Constructing

A generic type is a type with type parameters. We also call a generic trait/class a type constructor, or a parametrized class.

We write type parameters just after the type name in square brackets.


  def foo[T1,T2](arg1: T1, arg2: List[T2]) = ...
  class Foo[T1,T2](arg1: T1) { ...

  // specifying
  val specified_foo = foo[Int, String] _    // `_` turns the method with fixed type parameters into a function value
			
To specify the type parameters of a generic type we set them in square brackets. But when we use a generic type, in most cases the compiler can infer the type parameters for us, so we don't need to specify them,
eg: foo(1, "1"::Nil)

Using operator syntax

Generic type, which has two parameters can be specified using infix operator syntax. So two definitions below are equal:

  val p1:  String Pair Int  = ("1", 1)
  val p2: Pair[String, Int] = ("1", 1)
				
But this syntax is quite odd for most type names.

The situation is different if the type name looks like an operator.
So it can be useful when we create a type alias:


  type ##[A,B] = Pair[A,B]
  val p3:  String##Int  = ("1", 1)
				
If a type is parametrized we can't use it without type arguments:

		  def f(q: Queue)    // Error: we need to pass a type parameter to Queue
		

Generics relations

General principle in type system design

It is safe to assume that a type T is a subtype of a type U if you can substitute a value of type T wherever a value of type U is required. This is called the Liskov Substitution Principle.

Inheritance relations

A <: B means that A is subtype of B

Variance annotations

By default generic types are nonvariant (invariant). So if a value/function requires type A[T], it needs to get exactly type A[T].

Covariant type

A type A[] is covariant <=> for all types P1, P2: if P1 <: P2 then A[P1] <: A[P2]
We mark a covariant type parameter: A[+P]

Contravariant type

A type A[] is contravariant <=> for all types P1, P2: if P1 <: P2 then A[P1] >: A[P2]
We mark a contravariant type parameter: A[-P]

This is very important as it explains why covariance can cause some issues. Contravariance is literally the opposite of covariance: parameter types vary in the opposite direction of subtyping. It is a lot less common, partially because it is so counter-intuitive, though it does have one very important application: functions.

Example 1

	trait Output[-T] {
	  def write(x: T)
	}
		
Let's have two outputs: one of Seq and one of List (where List <: Seq).
Output is defined to be contravariant, so: Output[Seq] <: Output[List]
The only operation supported by Output[List] is writing a List to it. The same operation can be done with an Output[Seq], so it's safe to use an Output[Seq] in place of an Output[List]. On the other hand, if a function expects an Output[Seq] but gets an Output[List], it can't perform write(some_seq) (because that write expects a List).
Example 2

	trait Function1[-P, +R] {
	  def apply(p: P): R
	}
		
This declaration as a whole means that Function1 is contravariant in P and covariant in R. Thus, we can derive the following axioms:

	T1' <: T1
	T2 <: T2'
	---------------------------------------- S-Fun
	Function1[T1, T2] <: Function1[T1', T2']
		
Example:
We have a functions:

	def f: Seq => String
	def g: List => AnyRef
	def F: (List => AnyRef) => AnyRef
	def F2: (Seq => String) => String
		
So: f <: g and we can pass f and g to F, but only f fits to F2.

What would happen if we passed g to F2?
F2 would call g(some_seq), where g expects a List and performs List-specific operations on its argument - we get an error! As an exercise one can consider the return type.
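The rule can be checked with concrete types (the names mirror the sketch above; Int elements are our choice):

```scala
val f: Seq[Int] => String = _.mkString(",")
def F(g: List[Int] => AnyRef): AnyRef = g(List(1, 2, 3))

// f conforms to List[Int] => AnyRef because Function1 is contravariant
// in its argument (List[Int] <: Seq[Int]) and covariant in its result
// (String <: AnyRef)
F(f)   // "1,2,3"
```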

Sound of covariance

Scala's type system ensures that variance annotations are sound by keeping track of the positions where a type parameter is used. These positions are classified as covariant for the types of immutable fields, method results and lower bounds of type parameters. Positions are classified as contravariant for method argument types and upper bounds of type parameters. Type arguments to a nonvariant type parameter are always in nonvariant position.
The position flips between contra- and covariant inside a type argument that corresponds to a contravariant parameter. The type system enforces that covariant (respectively, contravariant) type parameters are only used in covariant (contravariant) positions.

Types of mutable fields are classified as nonvariant, since a mutable field has a corresponding setter method (whose argument is in contravariant position) and getter method (whose return value is in covariant position).

private[this] variables (vars and vals) do not affect variance and don't cause problems. The intuitive explanation is that, in order to construct a case where variance would lead to type errors, you need to have a reference to a containing object that has a statically weaker type than the type the object was defined with. For accesses to object-private values this is impossible.


Example:

  class N[T]
  class C[+T]
  class Cr[-T]
  def f1(a: N[AnyRef])
  def f2(a: C[AnyRef])          // f2 accepts covariant parameters
  def f3(a: Cr[Null])           // f3 accepts contravariant parameters

  f1(new N[String])             // Error
  f2(new C[String])             // OK
  f3(new Cr[String])            // OK

  //hypothetical code - which doesn't compile, because it violates covariance
  class Cell[T](init: T) {
    var current:T = init             // error: covariant type T occurs in contravariant position
  }

  val c1 = new Cell[String]("abc")
  val c2: Cell[Any] = c1
  c2.current = 1                     // so far so good
  val s: String = c1.current         // oops! type-checks, because c1 has type Cell[String], but would fail at runtime
	  

Escaping covariance position

Sometimes we want to use a covariant type parameter in the 'other'-variant position: we want to store together values of type T and values of related types.

   abstract class GenList[+T] { ...
     def prepend(x: T): GenList[T] =      // illegal! T in contravariant position
       new Cons(x, this)
   }
	  
With your new-found knowledge of co- and contravariance, you should be able to see why this example will not compile: look at the covariance of class fields and the contravariance of method arguments.
The problem is that T is covariant, while the prepend function expects its parameter type to be contravariant. Thus, T is varying in the wrong direction. Interestingly enough, we could solve this problem by making GenList contravariant in T, but then the return type GenList[T] would be invalid, as the prepend function expects its return type to be covariant.
Our only two options here are to a) make T invariant, losing the nice, intuitive subtyping properties of covariance, or b) add a local type parameter to the prepend method which defines T as a lower bound:
Lower bound

	abstract class GenList[+T] { ...
	  def prepend[S>:T](x: S): GenList[S] =   // now is OK :) 
	    new Cons(x, this)
	}
			
As an example, suppose there is a class Fruit with two subclasses, Apple and Orange. With the new definition of class GenList, it is possible to prepend an Orange to a GenList[Apple]. The result will be a GenList[Fruit].
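The Fruit example can be sketched concretely. This is a minimal runnable version; Cons and GenNil are assumed helper implementations, since the tutorial elides them:

```scala
class Fruit
class Apple extends Fruit
class Orange extends Fruit

abstract class GenList[+T] {
  def isEmpty: Boolean
  def head: T
  def tail: GenList[T]
  def prepend[S >: T](x: S): GenList[S] = new Cons(x, this)  // S is a supertype of T
}
object GenNil extends GenList[Nothing] {
  def isEmpty = true
  def head = throw new NoSuchElementException("head of empty list")
  def tail = throw new NoSuchElementException("tail of empty list")
}
class Cons[T](val head: T, val tail: GenList[T]) extends GenList[T] {
  def isEmpty = false
}

val apples: GenList[Apple] = new Cons(new Apple, GenNil)
val fruits: GenList[Fruit] = apples.prepend(new Orange)   // S is inferred as Fruit
```

Note that the compiler infers the smallest supertype of Apple and Orange for S, which is Fruit.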

Upper bound

When designing a function which sorts lists, you can constrain the element type from above with an upper bound: T <: Ordered[T] requires every element type to already be a subtype of Ordered[T].
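For instance, an insertion sort with the upper bound T <: Ordered[T] (Person is a made-up example class):

```scala
class Person(val name: String) extends Ordered[Person] {
  def compare(that: Person) = this.name.compareTo(that.name)
}

// upper bound: T must itself be a subtype of Ordered[T]
def insert[T <: Ordered[T]](x: T, xs: List[T]): List[T] = xs match {
  case Nil     => List(x)
  case y :: ys => if (x <= y) x :: xs else y :: insert(x, ys)  // <= comes from Ordered[T]
}
def isort[T <: Ordered[T]](xs: List[T]): List[T] = xs match {
  case Nil     => Nil
  case y :: ys => insert(y, isort(ys))
}

val people = isort(List(new Person("Eve"), new Person("Adam"), new Person("Bob")))
// people.map(_.name) == List("Adam", "Bob", "Eve")
```

By contrast, the view bound described later (T <% Ordered[T]) only requires that T be convertible to Ordered[T].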

Abstract members

Tour of abstract members


   trait Abstract {
     type T                        // abstract type
     def transform(x: T): T
     val initial: T
     var current: T
   }
	   
Abstract val
Scala can have abstract vals (so we can override object fields!)
An abstract val declaration resembles an abstract parameterless getter method declaration. The client is guaranteed that the abstract val will yield the same value every time it is referenced. For an abstract method, in contrast, that guarantee would not hold, because a concrete implementation could return a different value every time it's called.

Any implementation of an abstract val must be a val definition (not a var or def).

Abstract vals sometimes play a role analogous to superclass parameters. This is particularly important for traits, because traits don’t have a constructor to which you could pass parameters. So parametrizing a trait works via abstract vals that are implemented in subclasses.

CAUTION! A class parameter argument is evaluated before it is passed to the class constructor (unless the parameter is by-name). An implementing val definition in a subclass, by contrast, is evaluated only after the superclass has been initialized. So values depending on the definition of an abstract val should also be initialized in the subclass, or by using pre-initialized fields or a lazy val.
Abstract var
If you declare an abstract var you implicitly declare an abstract getter and setter method. There is no re-assignable field to be defined—that will come in subclasses that define the concrete implementation of the abstract var.

A concrete implementation of an abstract var can be a var or a corresponding pair of def getter/setter methods (a val can only implement the getter, so it is not enough on its own).
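A small sketch of the two implementation options (the Temperature trait is a made-up example):

```scala
trait Temperature {
  var celsius: Double              // implicitly declares an abstract getter and an abstract setter
}

// option 1: a plain var supplies both getter and setter
class T1 extends Temperature { var celsius = 0.0 }

// option 2: an explicit getter/setter pair backed by a private field
class T2 extends Temperature {
  private var c = 0.0
  def celsius: Double = c
  def celsius_=(x: Double): Unit = { c = x }
}

val t = new T2
t.celsius = 36.6                   // calls celsius_=
```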

Traits
Traits are by definition abstract. We can parametrize traits through abstract fields - see abstract vals.

Traits can be instantiated by an anonymous class that mixes in the trait.
To instantiate a trait, you need to implement its abstract definitions. Here is an example:


	 trait T{
	   val arg: Int
	   val t = 2* arg
	 }
	 var x = new T { val arg=expr}      // instantiation of trait T.
	                                    // CAUTION!!!   x.t has inconsistent value
		 
However there is a subtle difference between class and trait initialization. The expressions which define abstract members are evaluated as part of the initialization of the anonymous class, but the anonymous class is initialized after the trait. So the concrete values are not available during initialization of the trait - selecting them would instead yield the default value (like 0, "", null).
Hence, in the previous example, when expr=2 then x.t==0, which could be quite erroneous.

Abstract fields initialization

As mentioned before, there is a problem with implementing val definition in subclass, which is evaluated only after the superclass has been initialized, and fields in superclass depends on that abstract val.

Pre-initialized fields

Pre-initialized fields let you initialize a field of a subclass before the superclass is called. Because of that, pre-initialized field initializers cannot refer to the object that is being constructed (in a pre-initializer val pre = this.x, this does not denote the object being constructed). Consequently, if such an initializer refers to this, the reference goes to the object containing the class or object that is being constructed, not to the constructed object itself.

To make pre-initialized fields, simply place the anonymous class definition in braces before the superclass constructor call.
Pre-initialized fields can be used in traits, objects and named subclasses.


trait T {
  val arg :Int
  val t=2*arg
}

val x1= new T { val arg=2}                    // constructs anonymous class with body "val arg=2"
val x2= new { val arg=2} with T               // inherits from an anonymous class whose body "val arg=2" acts as the PRE-INITIALIZED FIELD
object x3  extends T { val arg=2}
object x4 extends { val arg=2} with T         // object inherits from anonymous class which body is "val arg=2"
class X5 (x: Int) extends T { val arg=x }
val x5 = new X5(2)
class X6 (val arg: Int) extends T
val x6 = new X6(2)
class X7 extends T { val arg=2 }              // here we implicitly inherit from AnyRef
val x7 = new X7
class X8 extends { val arg=2 } with T         // this is the general rule: we inherit from an anonymous class
val x8 = new X8

x1.t        //  0
x2.t        //  4
x3.t        //  0
x4.t        //  4
x5.t        //  0
x6.t        //  4
x7.t        //  0
x8.t        //  4
		

Lazy val

If you prefix a val definition with a lazy modifier, the initializing expression on the right-hand side will only be evaluated the first time the val is used.
This is similar to the situation where val is defined as a parameterless method, using a def. However, unlike a def a lazy val is never evaluated more than once.
So you can achieve the same as with pre-initialized fields, but in a cleaner way.

	  trait T {
	    val arg: Int
	    lazy val t = arg * g
	    lazy val t2 = arg / g
	    lazy val g = {                    // initialized on first use, before the initializations of t and t2 complete
	      require(arg > 0)
	      1000 / arg
	    }
	  }

	  val x = new T { val arg = 2 }       // now x.t yields a correct value == 1000 (arg is no longer seen as 0)
		
As we see, the initialization order doesn't matter as long as it neither produces side effects nor depends on them. g is initialized before t and t2 because they need it during their initialization.
Objects as lazy vals
Objects themselves behave like lazy vals, in that they are also initialized on demand, the first time they are used. An object definition can be seen as shorthand for the definition of a lazy val with an anonymous class that describes the object's contents.

  object X {
    println("hej");
  }
  // so far we don't get the message "hej"
  X       // this is the time the message "hej" appears
		  

Abstract types

Why we need them

Consider the code.

	class Food
	abstract class Animal {
	  def eat(food: Food)
	}
	class Grass extends Food
	class Cow extends Animal {
	  override def eat(food: Grass) {}    // This won’t compile
	}
		
We've got: error: class Cow needs to be abstract, since method eat in class Animal of type (Food)Unit is not defined
error: method eat overrides nothing...
What happened is that the eat method in class Cow does not override the eat method in class Animal, because its parameter type is different - it’s Grass in class Cow vs. Food in class Animal.

This behavior is justified. To see why, consider the case where the previous example were type-correct. Then given another class Fish <: Food we could call:


	val bessy: Animal = new Cow
	bessy eat (new Fish)                // disappointment - you could feed fish to cows.
	  
Get out
What if we need an abstract method whose argument type is specified in the implementing classes?

  class Food
  abstract class Animal {
    type SuitableFood <: Food
    def eat(food: SuitableFood)
  }
  class Grass extends Food
  class Cow extends Animal {
    type SuitableFood = Grass
    override def eat(food: Grass) {}
  }
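A runnable sketch of feeding bessy (eat is given a trivial body here so the example can run; lastMeal is an assumed helper field, not part of the original design):

```scala
class Food
class Grass extends Food
class Fish extends Food

abstract class Animal {
  type SuitableFood <: Food
  var lastMeal: Option[SuitableFood] = None
  def eat(food: SuitableFood): Unit = { lastMeal = Some(food) }
}
class Cow extends Animal { type SuitableFood = Grass }

val bessy = new Cow
bessy.eat(new Grass)          // fine: Grass == bessy.SuitableFood
// bessy.eat(new Fish)        // does not compile: Fish is not bessy.SuitableFood
```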
	  

Refinement type

In Scala we can refine types by supplying a base type with a number of members inside curly braces. Given the previous example:

   val x = new Animal { override def eat(food: SuitableFood) {} }
			
The type of x will be Animal

Structural subtyping

We use type refinement in structural subtyping. The difference is that a structural type can have additional members (not only refined ones). The syntax for a structural type is the same as for a refinement type.

We can describe the objective of structural subtyping as follows:
Suppose you want to collect all animals which eat grass in a list. There would be two simple solutions:

  • Making a trait GrassEaters and mixing it into every Animal class whose SuitableFood is Grass.
    val grassEaters: List[GrassEaters] = List(...)
    The downside is verbosity - you need to specify SuitableFood and add an artificial GrassEaters trait.
  • using type refinement:
    val grassEaters: List[Animal { type SuitableFood = Grass }] = List(...)
    This is a much cleaner solution. You don't need to remember to add the GrassEaters trait, and in client code you don't need to know about any artificial trait. Furthermore, when a client would like MeatEaters, he wouldn't need to change the library code.
Other example:

We want to implement the loan pattern - a function that takes an object, performs some operation using it, and cleans up. We need to ensure somehow that the object has a method to perform the cleanup:


   def using[T <: { def close(): Unit }, S]
            (obj: T) (operation: T => S) = {                       // curried function
     val result = operation(obj)                                   // performing operation
     obj.close()                                                   // cleaning up
     result                                                        // return the operation result
   }

   //use case (assumes import java.io.PrintWriter and import java.util.Date):
   using(new PrintWriter("date.txt")) { writer =>
     writer.println(new Date)
   }
				
Remark: if no base type is specified, Scala uses AnyRef automatically.
So here type T is a structural subtype of AnyRef.

Compound Types

Suppose we want a method accepting an argument which mixes in two traits: Tr1 and Tr2.
To achieve this we can use a compound type, written with the with keyword:

  trait Tr1 { def method_tr1 ... }
  trait Tr2 { def method_tr2 ... }

  def makeTr1_Tr2(arg: Tr1 with Tr2) {
    arg.method_tr1()
    arg.method_tr2()
  }
				
In similar way we can also create variable with type "on the fly" supporting Tr1 and Tr2:

  var x = new SomeClass with Tr1 with Tr2
				

Path-dependent types

As we see, types can be members in Scala. We call such a member a path-dependent type; the qualified name of such a type is the path to the member followed by the name of the member.
So, for a cow instance bessy, the fully qualified name of SuitableFood is mypackage.bessy.SuitableFood. In general, such a type has the form x1. ... .xn.t, where n > 0, x1, ..., xn denote immutable values and t is a type member of xn. Path-dependent types are a novel concept of Scala.

Differences with Java inner classes

A path-dependent type resembles the syntax for an inner class type in Java, but there is a crucial difference: a path-dependent type names an outer object, whereas an inner class type names an outer class. Java-style inner class types can also be expressed in Scala, but they are written differently.

	class Outer {
	  class Inner
	}
	// Java access:
	Outer.Inner
	// SCALA access:
	Outer#Inner        // The ‘.’ syntax is reserved for objects only.

	val o1 = new Outer
	val o2 = new Outer
		  
o1.Inner, o2.Inner are two different path-dependent types.
Outer#Inner is a general type, which represents the Inner class with an arbitrary outer object of type Outer.
By contrast, type o1.Inner refers to the Inner class with a specific outer object (the one referenced from o1).
Instantiating inner classes
In Scala, as in Java, inner class instances hold a reference to an enclosing outer class instance. This allows an inner class to access members of its outer class. Thus you can’t instantiate an inner class without specifying an outer class instance.
One way to do this is to instantiate the inner class inside the body of the outer class. In this case, the current outer class instance (referenced from this) will be used.
Another way is to use a path-dependent type. For example, because the type, o1.Inner, names a specific outer object, you can instantiate it: new o1.Inner . The resulting inner object will contain a reference to its outer object, the object referenced from o1.
By contrast, because the type Outer#Inner does not name any specific instance of Outer, you can’t create an instance of it: new Outer#Inner // Error
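A minimal sketch (the who method is added here just to expose the captured outer reference):

```scala
class Outer {
  class Inner {
    def who: Outer = Outer.this        // the enclosing instance the inner object holds
  }
}

val o1 = new Outer
val i1 = new o1.Inner                  // ok: o1 names a specific outer object
val general: Outer#Inner = i1          // o1.Inner is a subtype of Outer#Inner
// val i2 = new Outer#Inner            // error: no specific outer instance to attach to
```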

Path prefix immutability

Path-dependent types rely on the immutability of the prefix path. Here is an example where this immutability is violated.

	abstract class AbsCell {
	  type T
	  val init: T
	  private var value: T = init
	  def get: T = value
	  def set(x: T): Unit = { value = x }
	}

	var flip = false
	def f(): AbsCell = {
	  flip = !flip
	  if (flip) new AbsCell { type T = Int; val init = 1 }
	  else new AbsCell { type T = String; val init = "" }
	}
	f().set(f().get)                      // illegal!
		
f() returns cells whose value type alternates between Int and String.
The last statement in the code above is erroneous, since it tries to set an Int cell to a String value. The type system does not admit this statement, because the computed type of f().get would be f().T. This type is not well-formed, since the method call f() is not a path (it doesn't denote an immutable value).

Lambda types

Suppose we have a generic type parametrized by other generic type:
trait Functor[F[_]] - some container type supporting fmap operation.

How could we create a Functor type parametrized by a type having three type parameters, e.g. Function2[A, B, R]?

The solution is to bind all but one of the type parameters in a structural type:


  implicit def Function2Functor[A, B] = new Functor[({type λ[R]=(A, B) => R})#λ] {
      // definition of abstract method
  }
			
Here we create an implicit function which gives us an implicit Functor object for (A, B) => R.
We bind the A and B types in the structural type {type λ[R] = (A, B) => R}, which has one type field parametrized by one type parameter.
Then we extract the field type with # to get what we want.
The syntax presented above is called a lambda type.

We couldn't write new Functor[(A, B) => R] because the type (A, B) => R is not a type constructor - it doesn't take any type parameter. The type is already constructed by Function2[_,_,_].

Function2[_,_,_] is a generic type, but it takes 3 parameters. Functor expects a type which is parametrized by only one type (a one-parameter type constructor).

Enumerations

Scala doesn't need built-in language constructs for enumerations. Instead it uses existing language features to provide them.

To create a new enumeration, you define an object that extends scala.Enumeration class, as in the following example:


  object Color extends Enumeration {
    val Red, Green, Blue = Value
  }

  object Direction extends Enumeration {
    val North = Value("North")           // overloaded Value method which takes a name argument
  }
	  
Enumeration defines an inner class named Value, and the same-named parameterless Value method returns a fresh instance of that class. This means that a value such as Color.Red is of type Color.Value which is a path-dependent type, with Color being the path and Value being the dependent type.

Values of an enumeration are numbered from 0, and you can obtain a value's number with its id method:
Direction.North.id

It’s also possible to go the other way, from a non-negative integer number to the value that has this number as id in an enumeration:
Direction(0)
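Putting the pieces together:

```scala
object Direction extends Enumeration {
  val North, East, South, West = Value   // four fresh Value instances, with ids 0..3
}

Direction.North.id                       // 0
Direction(2)                             // South: lookup by id
Direction.values.size                    // 4: the set of all defined values
```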

Implicits

Implicit functions

We define an implicit function by prefixing a function definition with the implicit keyword, e.g.:

	implicit def intToX(i: Int) = new X(i)
		  
When such a function is directly in scope (accessible without any preceding identifier), the compiler can implicitly convert an Int to X (for example when a function expects X but we pass an Int).

Implicit conversions

Implicit conversions are made by implicit functions.
Implicit definitions are made by prepending implicit keyword before normal definition.

Implicit definitions are those that the compiler is allowed to insert into a program in order to fix any of its type errors.
For example, if x + y does not type check, then the compiler might change it to convert(x) + y, where convert is some available implicit function. If convert changes x into something that has a + method, then this change might fix a program so that it type checks and runs correctly. If convert really is just a simple conversion function, then leaving it out of the source code can be a clarification.

When the compiler can use implicit conversions
When more conversions are available
According to the only-one-available rule, the compiler allows more than one conversion to be available (in the same scope) only when one of them is strictly more specific than the others.
One implicit conversion is more specific than another if one of the following applies:
  • The argument type of the former is a subtype of the latter’s.
  • Both conversions are methods, and the enclosing class of the former extends the enclosing class of the latter.
So if there are two functions: f1: List[Int] => String; f2: Seq[Int] => String and compiler expects String, but gets List[Int], then it will choose f1 function.
The consequence of the second point is that in "abc".reverse the compiler will choose the conversion from String to StringOps <: SeqLike[Char] instead of WrappedString <: SeqLike[Char] (the old one from Scala 2.7), because the former is declared in scala.Predef, the latter in scala.LowPriorityImplicits, and Predef <: LowPriorityImplicits.
Where implicits are tried
scala.Predef contains numerous helpful implicit conversions.
How to check which implicits were chosen
when compiling, run:
scala(c) -Xprint:typer
It will show you what your code looks like after all implicit conversions have been added by the type checker.

Converting the receiver

This kind of implicit conversion has two main uses. To see how it works, suppose you write down obj.doIt, and obj does not have a member named doIt. The compiler will try to insert conversions before giving up. In this case, the conversion needs to apply to the receiver, obj. The compiler will act as if the expected "type" of obj were "has a member named doIt." This "has a doIt" type is not a normal Scala type, but it is there conceptually, and it is why the compiler inserts an implicit conversion in this case.

Interoperating with new types


	class Rational(n: Int, d: Int) {
	  ...
	  def + (that: Rational): Rational = ...
	  def + (that: Int): Rational = ...
	}
	// so we can add either two rational numbers or rational and int
	val r = new Rational(1,2)
	1 + r                        // error, Int doesn't have method +(x: Rational)
	implicit def intToRational(x: Int) = new Rational(x, 1)
	1 + r                        // now we can add int to rational
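A complete, runnable version of the sketch (the arithmetic in + is filled in here; the tutorial elides it):

```scala
import scala.language.implicitConversions

class Rational(val n: Int, val d: Int) {
  def + (that: Rational): Rational = new Rational(n * that.d + that.n * d, d * that.d)
  def + (that: Int): Rational = this + new Rational(that, 1)
  override def toString = n + "/" + d
}

implicit def intToRational(x: Int): Rational = new Rational(x, 1)

val r = new Rational(1, 2)
val sum = 1 + r              // rewritten by the compiler to intToRational(1) + r
// sum.toString == "3/2"
```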
			

Simulating new syntax

-> in 1 -> "one" is not special syntax! Instead, -> is a method of the class ArrowAssoc, a class defined inside the standard Scala library, which also provides an implicit conversion from Any to ArrowAssoc.
Here the compiler inserts the implicit conversion from 1 to ArrowAssoc.

Implicit parameters

An implicit parameter is a variable or an object missing from a function's argument list that the compiler supplies - it can replace someCall(a) with someCall(a)(b), or new SomeClass(a) with new SomeClass(a)(b).

What is supplied implicitly is the entire last curried parameter list, not just a single parameter.

To let the compiler supply the parameter implicitly, you must first define a variable of the expected type, which is marked with implicit keyword.

implicit keyword applies to an entire parameter list, not to individual parameters.


  class PreferredPrompt(val preference: String)
  object Greeter {
    def greet(name: String)(implicit prompt: PreferredPrompt, drink: String) {
      println("Welcome, "+ name +". The system is ready.")
      println("why not enjoy a cup of "+ drink +"?")
      println(prompt.preference)
    }
  }

  implicit val bobsPrompt = new PreferredPrompt("relax> ")        // it must be marked implicit if compiler might use it
  //implicit object BobsPrompt extends PreferredPrompt("relax> ") // Other way to achieve the same as the previous statement
  implicit val bobsDrink = "Coca-Cola"
  Greeter.greet("Bob")(bobsPrompt)                                // error: not enough arguments...
  Greeter.greet("Bob")                                            // the compiler converts it to Greeter.greet("Bob")(bobsPrompt, bobsDrink)
			
You need to be careful which variables you make implicit. It is wise not to select popular types for implicits; instead choose rare types (like PreferredPrompt). As a result, it is unlikely that implicit variables of these types will be in scope unless they are intended to be used as implicit parameters.

Type bound

First, let's look at some interesting implicit-related constructs:

View bound

Consider the maxListElem function from the Providing more information section below. In the body of maxListElem we never use the orderer function explicitly, so the code would remain correct if we renamed that argument to anything else.
Because this pattern is common, Scala lets you leave out the name of this parameter and shorten the method header by using a view bound.

View bounds are made by putting <% in the type parameter declaration:


  def maxList[T <% Ordered[T]](elements: List[T]): T = ...

  // which compiles to:
  def maxList[T](elements: List[T])(implicit ev: T=>Ordered[T]): T = ...
			  
T <% Ordered[T] means "I can use T, as long as T can be treated as an Ordered[T]."
For example, class Int is not a subtype of Ordered[Int], but you can pass a List[Int] to maxList because an implicit conversion from Int to Ordered[Int] is available.
Moreover, if type T happens to already be an Ordered[T], you can still pass a List[T] to maxList. The compiler will use an implicit identity[T] function, declared in Predef.
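Exercised directly, in the desugared form shown above (the body is filled in with reduceLeft for this sketch):

```scala
def maxList[T](elements: List[T])(implicit ev: T => Ordered[T]): T =
  elements.reduceLeft((a, b) => if (ev(a) < b) b else a)

maxList(List(1, 5, 3))          // 5: an implicit Int => Ordered[Int] is available via Predef
maxList(List("b", "c", "a"))    // "c": likewise for String
```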

What are applications for view bounds

pimp my library pattern - a way to "add" methods to existing classes, and return the original type.
E.g.: we want to use the rich functionality of Scala strings - StringOps - while still working with the original String.
Eg2:

 def f[A <% Ordered[A]](xs: A*): Seq[A] = xs.toSeq.sorted

// even if the type is only used as a type parameter of the return type
 def f[A <: Ordered[A]](xs: A*): Seq[A] = xs.toSeq.sorted   // oops, not every type supported 
 def f[A](xs: Ordered[A]*): Seq[A] = xs.toSeq               // implicit conversion to an expected type occurs,
                                                            // return type is Seq[Ordered[A]]
				
This example won't work without view bounds. However, if I were to return another type, then I don't need a view bound any more:

	def f[A](a: Ordered[A], b: A): Boolean = a < b
				
Eg3. Handling String and Array, which are Java classes, like they were Scala collections:

 def handle_collection[CC <% Traversable[_]](a: CC, b: CC): CC =
   if (a.size < b.size) a else b
				
If one tried to make handle_collection without view bounds, the return type of a String would be a WrappedString (Scala 2.8), and similarly for Array.

Context bound

We write context bound:

  def context_fun[T : P](a: T) = ...

 // which compiles to:
  def context_fun[T](a: T)(implicit v: P[T]) = ...
			  
Context bounds are used to ensure that there exists some implicit value of the parametrized type P[T].
We can say: "for type T, ensure the existence of an implicit value x in context P; the type of x is P[T]".

The common example of usage in Scala is this:


  def f_context_ord[A : Ordering](a: A, b: A) = implicitly[Ordering[A]].compare(a, b)
			  
f_context_ord requires some implicit Ordering[A] instance to compare a and b.

Context Bounds generalize View Bounds.


  def f1[T <% String](t: T) = 0

 // equivalent with context bound
  trait To[T] { type From[F] = F => T }
  def f2[T : To[String]#From](t: T) = 0
				
A context bound must be used with a type constructor of kind * => * (the `function` used to obtain the implicit value takes a type T and yields the type of the implicit value, the context type C[T]). However, the type constructor for Function1 is of kind (*, *) => * (it takes two types: the function's argument type and its result type). The type alias partially applies the second type parameter with the type String, yielding a type constructor of the correct kind for use as a context bound.

There is a proposal to allow you to directly express partially applied types in Scala, without the use of the type alias inside a trait. You could then write:


  def f3[T : [X](X => String)](t: T) = 0
				

Constructing Arrays

An Array initialization on a parametrized type requires a ClassManifest[A] to be available, for arcane reasons related to type erasure and the non-erasure nature of arrays.

  def f1[A](n: Int) = new Array[A](n)                   // error: cannot find class manifest for element type A
  def f1[A : ClassManifest](n: Int) = new Array[A](n)
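A usage sketch (note that in modern Scala, ClassManifest has been superseded by ClassTag and survives only as a deprecated alias):

```scala
def fillArray[A : ClassManifest](n: Int, elem: A): Array[A] = {
  val a = new Array[A](n)        // legal: a ClassManifest[A] is in scope
  for (i <- 0 until n) a(i) = elem
  a
}

val xs = fillArray(3, "x")       // Array("x", "x", "x")
```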
				
Examples:

  def **[T : Numeric](xs: Iterable[T], ys: Iterable[T]) =
        xs zip ys map { t => implicitly[Numeric[T]].times(t._1, t._2) }   // We get implicit context value through implicitly function
  def **[T](xs: Iterable[T], ys: Iterable[T])(implicit num: Numeric[T]) =
       xs zip ys map { t => num.times(t._1, t._2) }                       // the same as above, written with an explicit implicit parameter
			  

What are applications for context bounds

typeclass pattern - implements an alternative to inheritance by grouping many different types that share a common set of methods, just like a superclass can be used to group its subclasses.
The grouping is done through a sort of implicit adapter pattern. More about using the typeclass pattern

The classic example is Scala 2.8's Ordering, which replaced Ordered throughout Scala's library and takes advantage of some implicit conversions inside Ordering that enable the traditional operator style. Another example in Scala 2.8 is Numeric:


  def f[A](a: A, b: A)(implicit ord: Ordering[A]) = {
    import ord._                                        // import members from implicit value ord
    if (a < b) a else b                                 // you can call explicitly ord.lt(a,b)
  }

  def f[A : Numeric](a: A, b: A) = implicitly[Numeric[A]].plus(a, b)
				
A more complex example is the new collection usage of CanBuildFrom. And, as mentioned before, there's the ClassManifest usage, which is required to initialize new arrays without concrete types.

The context bound with the typeclass pattern is much more likely to be used by your own classes, as they enable separation of concerns, whereas view bounds can be avoided in your own code by good design (they are used mostly to get around someone else's design).

Though it has been possible for a long time, the use of context bounds has really taken off in 2010, and is now found to some degree in most of Scala's most important libraries and frameworks. The most extreme example of its usage, though, is the Scalaz library, which brings a lot of the power of Haskell to Scala. I recommend reading up on typeclass patterns to get more acquainted with all the ways in which it can be used.


This material on implicits was compiled with the help of some StackOverflow answers:

Common patterns with implicits

Function composition

Function composition is not as flexible as in Haskell, since Scala functions are not curried by default. So mostly we need to work around this with partially applied functions.

Scala one-argument functions (class Function1) have two methods, andThen and compose, which perform composition: (f andThen g)(x) == g(f(x)), and (f compose g)(x) == f(g(x)).
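For example:

```scala
val f = (x: Int) => x + 1
val g = (x: Int) => x * 2

val h1 = f andThen g     // h1(x) == g(f(x))
val h2 = f compose g     // h2(x) == f(g(x))

h1(3)                    // g(f(3)) == g(4) == 8
h2(3)                    // f(g(3)) == f(6) == 7
```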

Using Scalaz
A good solution is to use the Scalaz &&& operator.

Providing more information

Consider the function which finds the maximum element in the given list:

	def maxListElem[T <: Ordered[T]](elements: List[T]): T =
	  ...
			
Everything would go fine, but many popular built-in types are not subtypes of Ordered[] (like Int, String, ...).
Many built-in types have implicit methods which convert them to popular traits, like Ordered.
The solution would be to add additional information to type T - how to convert it to the type Ordered[T]:

  def maxListElem[T](elements: List[T])
        (implicit orderer: T => Ordered[T]): T =
    elements match {
      case List() =>
        throw new IllegalArgumentException("empty list!")
      case List(x) => x
      case x :: rest =>
        val maxRest = maxListElem(rest)(orderer)        // orderer passed explicitly; it could equally be left implicit
        if (x > maxRest) x                              // orderer(x) is inserted implicitly, as x's type doesn't have a > method
        else maxRest
  }

  implicit def myTypeToOrdered(x: MyType) = new Ordered[MyType] {
    def compare(that: MyType) = x.some_field - that.some_field
  }
			
Now we can use function maxListElem with List[MyType]

Trait replacement

To ensure that some class has some functionality, we can use the solution below instead of mixing in traits:

  // with view -- implicit function
  abstract class MyType (implicit cmp : MyType => Ordered[MyType]) {
    ...
  }
  implicit def MyTypeToOrdered(x: MyType) = new Ordered[MyType] {
    def compare ...
  }
  // or with context bound:
  abstract class MyType (implicit ev : Ordering[MyType]) {
    ...
  }
  implicit val MyTypeToOrdering = new Ordering[MyType]{
	  def compare(a: MyType, b: MyType) = ...
  }

  // Instead of:
  abstract class MyType extends Ordered[MyType] {


  // other use case for class parameter
  abstract class RedBlackTree[K, V] (implicit cmp : K => Ordered[K]) ...   // equivalent to [K <% Ordered[K], V], but we can use the cmp method directly
			  
The benefits of such solution are described in other parts of this subsection:

Conditional type extend

Consider example:
We have an abstract class with a type field.
We want to compare those subclasses which have the same value of the type member.
It is problematic to do this at the type level with traits without throwing an exception when the objects don't have the same type field.
But implicit values do a good job here.

  abstract class A{
    type B
  }
  // We want to compare instances of classes C1 <: A and C2 <: A only when C1#B = C2#B

  type AA[T] = A { type B = T }           // type alias for A in generic form

  implicit def aIsOrdered[T](a : AA[T]) = new Ordered[AA[T]] {
    def compare(that : AA[T]) = ...
  }
				
Now we can pass a list of A subtypes which have the same type member B to anything that requires an Ordered view, as well as an Ordering context.
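A runnable sketch, assuming a hypothetical key field to compare on:

```scala
import scala.language.implicitConversions

abstract class A {
  type B
  def key: Int                   // hypothetical field used only for the comparison
}
type AA[T] = A { type B = T }    // type alias for A in generic form

implicit def aIsOrdered[T](a: AA[T]): Ordered[AA[T]] = new Ordered[AA[T]] {
  def compare(that: AA[T]) = a.key - that.key
}

class C1 extends A { type B = String; def key = 1 }
class C2 extends A { type B = String; def key = 2 }

val c1 = new C1
val c2 = new C2
c1 < c2                          // compiles only because both classes fix B = String
```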

Rich methods as implicit functions

Let's take another example:
We want to make a DSL for regular expressions. There will be a base type representing the RegExp family, and an implicit conversion from String to the Str type - a simple regexp matching exactly that string.

  abstract class RegExp {
    def nullable: Boolean
    def derive(c: Char): RegExp
    def matches(s: String): Boolean =     // note: `match` is a reserved word in Scala, hence `matches`
      if (s.isEmpty) nullable
      else derive(s.head).matches(s.tail)
  }

  case object Empty extends RegExp {
    def nullable = false
    def derive(c: Char) = Empty
  }

  case object Eps extends RegExp {
    def nullable = true
    def derive(c: Char) = Empty
  }

  case class Str(s: String) extends RegExp {
    def nullable = s.isEmpty
    def derive(c: Char) =
      if (s.isEmpty || s.head != c) Empty
      else Str(s.tail)
  }

  case class Cat(r: RegExp, s: RegExp) extends RegExp {
    def nullable = r.nullable && s.nullable
    def derive(c: Char) =
      if (r.nullable) Or(Cat(r.derive(c), s), s.derive(c))
      else Cat(r.derive(c), s)
  }

  case class Star(r: RegExp) extends RegExp {
    def nullable = true
    def derive(c: Char) = Cat(r.derive(c), this)
  }

  case class Or(r: RegExp, s: RegExp) extends RegExp {
    def nullable = r.nullable || s.nullable
    def derive(c: Char) = Or(r.derive(c), s.derive(c))
  }

  case class And(r: RegExp, s: RegExp) extends RegExp {
    def nullable = r.nullable && s.nullable
    def derive(c: Char) = And(r.derive(c), s.derive(c))
  }

  // repetitions, eg "Rep("a", 4)" matches "aaaa"
  case class Rep(r: RegExp, n: Int) extends RegExp {
    def nullable = r.nullable
    def derive(c: Char) = repr.derive(c)
    def repr: RegExp = {
      def aux(i: Int): RegExp =
        if (i <= 1) r
        else Cat(r, aux(i - 1))
      aux(n)
    }
    // Another possibility to represent Rep is to move the repr function to some object
  }

  case class Not(r: RegExp) extends RegExp {
    def nullable = !r.nullable
    def derive(c: Char) = Not(r.derive(c))
  }
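Since `derive` and the matching loop drive everything, a tiny self-contained excerpt of the hierarchy above (with the matching method called `matches`, because `match` is a reserved word) shows how a `Str` pattern consumes its input character by character:

```scala
// Minimal excerpt of the classes above, enough to trace derivatives by hand
sealed trait RegExp {
  def nullable: Boolean
  def derive(c: Char): RegExp
  def matches(s: String): Boolean =
    if (s.isEmpty) nullable else derive(s.head).matches(s.tail)
}
case object Empty extends RegExp {
  def nullable = false
  def derive(c: Char) = Empty
}
case class Str(s: String) extends RegExp {
  def nullable = s.isEmpty
  def derive(c: Char) = if (s.isEmpty || s.head != c) Empty else Str(s.tail)
}

// Str("ab").derive('a') == Str("b")  - the leading 'a' has been consumed
// Str("ab").matches("ab") == true; Str("ab").matches("ax") == false
```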
				

We can construct regular expressions (for example to match simple string "start" or "end") using:

  • constructor composition, eg: Or(Str("start"), Str("end"))
  • operators, eg: Str("start") | Str("end")
We can achieve operator support by:
  • adding operator methods to the base class (I assume it is clear how to do that)
  • making an implicit conversion to a structural subtype which contains methods operating on RegExp:
    
      object RegExpPimps {
        implicit def string2RegExp(s: String) = Str(s)

        implicit def regExpOps(r: RegExp) = new {
          def | (s: RegExp) = Or(r, s)
          def & (s: RegExp) = And(r, s)
          def % = Star(r)
          def %(n: Int) = Rep(r, n)
          def ? = Or(Eps, r)
          def ! = Not(r)
          def ++ (s: RegExp) = Cat(r, s)
          def ~ (s: String) = r matches s
        }

        implicit def stringOps(s: String) = new {
          def | (r: RegExp) = Or(s, r)
          def | (r: String) = Or(s, r)
          def & (r: RegExp) = And(s, r)
          def & (r: String) = And(s, r)
          def % = Star(s)
          def % (n: Int) = Rep(Str(s), n)
          def ? = Or(Eps, s)
          def ! = Not(s)
          def ++ (r: RegExp) = Cat(s, r)
          def ++ (r: String) = Cat(s, r)

          def ~ (t: String) = string2RegExp(s) matches t
        }
      }
We can use this as follows: in the main function we construct some regular expressions, and then we test them.

  object Test {
    def main(args: Array[String]) {

      // we start by opening the RegExpPimps object content, to get access to the implicit operators
      //     and to the implicit conversion from String to RegExp
      import RegExpPimps._

      // here we construct some regular expressions
      val digit = "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"
      val digits = digit ++ digit.%                // one or more digits (remember Rep(r, n) matches exactly n repetitions)
      val int = ("+" | "-").? ++ digits
      val real = ("+" | "-").? ++ digits ++ ("." ++ digits).? ++ (("e" | "E") ++ ("+" | "-").? ++ digits).?

      // Some strings to test regular expressions
      val ints = List("0", "-4534", "+049", "99")
      val reals = List("0.9", "-12.8", "+91.0", "9e12", "+9.21E-12", "-512E+01")
      val errs = List("", "-", "+", "+-1", "-+2", "2-")

      // testing
      // ~ calls the matches function
      ints.foreach(s => assert(int ~ s))
      reals.foreach(s => assert(!(int ~ s)))
      errs.foreach(s => assert(!(int ~ s)))

      ints.foreach(s => assert(real ~ s))
      reals.foreach(s => assert(real ~ s))
      errs.foreach(s => assert(!(real ~ s)))
    }
  }
				
The difference between rich methods and base-class support is that with rich methods we have access to the methods only after importing the implicit functions.
So if some part of the code has implicit conversions to two or more rich-method objects, we can choose which one to use by importing it precisely.

Typeclasses - implicit objects against polymorphism with traits

Only one definition per trait problem
When mixing in a trait we are stuck with the single definition of the feature which the trait declares.

When using typeclasses we benefit from multiple feature definitions. We just import the typeclass instance we want, and we can work with a different interpretation of the feature.

We can achieve the same with traits by structural subtyping, but it doesn't look nice when we need to mix in some other functionality and overwrite the default one in some context.

  // mixin trait way
  trait FeatureX { def makeX: Int }
  class A extends FeatureX {
    def makeX = 1                          // default makeX interpretation
  }
  val a = new A
  val a2 = new A {
    override def makeX = 2                 // other makeX interpretation
  }
  val a3 = new A with SomeTraitWithFeatureX    // a3 with an arbitrary makeX interpretation from SomeTraitWithFeatureX

  // typeclass way
  class A
  trait FeatureX[T] { def makeX(t: T): Int }
  object Interpretation1 {
    implicit val aWithFeatureX = new FeatureX[A] {
      def makeX(t: A) = 1                  // first makeX interpretation
    }
  }
  object Interpretation2 {
    implicit val aWithFeatureX = new FeatureX[A] {
      def makeX(t: A) = 2                  // second makeX interpretation
    }
  }
  // only one interpretation should be imported into a given scope:
  import Interpretation1.aWithFeatureX
  // or some other implicit object with an arbitrary makeX interpretation for type A:
  //   import SomeImplicits.a3_WithFeatureX
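To make this concrete, here is a runnable sketch (all names hypothetical) where two interpretations of the same typeclass live in separate objects and are selected by import:

```scala
class Account(val balance: Int, val age: Int)

trait Priority[T] { def score(t: T): Int }

// two competing interpretations of Priority[Account], kept in separate scopes
object ByBalance {
  implicit val accountPriority: Priority[Account] =
    new Priority[Account] { def score(a: Account) = a.balance }
}
object ByAge {
  implicit val accountPriority: Priority[Account] =
    new Priority[Account] { def score(a: Account) = a.age }
}

object Ranker {
  def rank[T](t: T)(implicit p: Priority[T]): Int = p.score(t)
}
```

At the call site, `import ByBalance._` makes `Ranker.rank(acc)` return the balance, while `import ByAge._` in another scope returns the age - something a single mixed-in trait cannot offer.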
			
Only one parametrization of a trait problem
Another problem with mixins appears when we rely on some type being usable with multiple parametrizations of the same parametrized trait.
Consider the example:

  case class X(x: Int) extends Ordered[X] {
    def compare(other: X) = x - other.x
  }

  def binSearch[A <% Ordered[A]](a: Array[A], v: A): Option[Int] = {
    def recurse(low: Int, high: Int): Option[Int] = (low + high) / 2 match {
      case _ if high < low => None
      case mid if a(mid) > v => recurse(low, mid - 1)
      case mid if a(mid) < v => recurse(mid + 1, high)
      case mid => Some(mid)
    }
    recurse(0, a.size - 1)
  }

  binSearch(Array(X(1), X(2)), X(2))
  binSearch(Array(X(1), X(2)), 2)                          // Type error
			
We would like to see X as both Ordered[X] and Ordered[Int], but we don't want to see Int as X.
One could think of adding another mixin:

  case class X(x: Int) extends Ordered[X] with Ordered[Int] ...       // Illegal!
			
But this is impossible due to type erasure: after erasure both Ordered[Int] and Ordered[X] are seen as the same class, which creates an ambiguity, so the compiler doesn't allow it.
One good solution is to use a more elaborate implicit conversion:

  def binSearch[B, A <% Ordered[B]](a: Array[A], v: B) =  ...
  // an implicit conversion from X to Ordered[X] exists - it is the identity
  // we need an implicit conversion from X to Ordered[Int] to search with an Int key

  implicit val compXToInt: X => Ordered[Int] =
    t => new Ordered[Int] {
      def compare(that: Int): Int = t.x - that
    }
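Putting the pieces together, a runnable sketch - with the view bound `A <% Ordered[B]` written in its desugared form, an implicit `A => Ordered[B]` parameter, which works the same way:

```scala
case class X(x: Int) extends Ordered[X] {
  def compare(other: X) = x - other.x
}

object Search {
  // the view X => Ordered[Int], so an Array[X] can be searched with an Int key
  implicit val compXToInt: X => Ordered[Int] = t => new Ordered[Int] {
    def compare(that: Int): Int = t.x - that
  }

  // `A <% Ordered[B]` desugared into an explicit implicit evidence parameter
  def binSearch[B, A](a: Array[A], v: B)(implicit view: A => Ordered[B]): Option[Int] = {
    def recurse(low: Int, high: Int): Option[Int] = (low + high) / 2 match {
      case _ if high < low         => None
      case mid if view(a(mid)) > v => recurse(low, mid - 1)
      case mid if view(a(mid)) < v => recurse(mid + 1, high)
      case mid                     => Some(mid)
    }
    recurse(0, a.size - 1)
  }
}
```

`Search.binSearch(Array(X(1), X(2)), X(2))` works through the identity view, and `Search.binSearch(Array(X(1), X(2)), 2)` through `compXToInt`.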
			
Another solution might be to use typeclasses.

More information about this problem and some solutions can be found here. The first comment is especially interesting.

Ordered vs Ordering

Ordered[T] is a trait which we mix into type T in order to compare T values.
When we want compare-like methods against other types too, we need a trick:

  // BAD!!! code - it hacks the type system and can cause trouble later (read the subsection above)
  case class Thing(n: Int) extends Ordered[Any] {
    def compare(that: Any): Int = that match {
      case i: Int => this.n - i
      case x: Thing => this.n - x.n
      case _ => throw new IllegalArgumentException("bad type")
    }
  }
  // A better solution is the one from the previous subsection:
  //   a normal compare method (which takes a Thing argument) plus additional implicit conversions
			  
Otherwise we would need to implement dozens of compare-like methods (>, >=, ...).

Ordering[T] is a trait - a typeclass - which we use to compare T values by creating (or importing) an implicit object of type Ordering[T] with the appropriate method definitions.

Ordering has more functions to work with, eg reverse, which returns the reversed ordering of the type. Working with reverse requires explicitly applying the Ordering instance at the method call:


  // TreeMap requires Ordering context
  // assuming we have some Ordering[Foo]
new TreeMap[Foo, Bar]()(implicitly[Ordering[Foo]].reverse)
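The same mechanism works for any method taking an implicit Ordering - a small runnable sketch:

```scala
object ReverseDemo {
  val xs = List(3, 1, 2)

  val asc  = xs.sorted                                 // uses the implicit Ordering[Int]
  val desc = xs.sorted(Ordering[Int].reverse)          // the reversed instance, passed explicitly
  val maxUnderReverse = xs.max(Ordering[Int].reverse)  // "max" w.r.t. the reversed ordering, ie the minimum
}
```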
				

Ordered is all but deprecated in favor of Ordering. Ordering is strictly more powerful, because you can have several Orderings for a class Foo, whereas Foo can only implement Ordered once. (You can fake it by having Foo not implement Ordered, making several implicit conversions to Ordered[Foo] and controlling the scope of those implicits, but this is kind of a hack and also performs very poorly.)

Thanks to the implicit definitions in scala.math.LowPriorityOrderingImplicits, Ordered types (or types with an implicit conversion to an Ordered type) are sufficient to provide us with the corresponding Ordering typeclass instances.

Narrowing argument types

This presents something similar to a union type construction.

Motivation: we want to make a parametrized method whose type parameter is restricted to an arbitrary set of types, eg Int and Long.
We do it using the typeclass Acceptable - an abstract class which performs as a type guardian; its implicit objects serve as evidence that a type can be used with the desired method.


  abstract class Acceptable[T]
  object Acceptable {
    implicit object IntOk extends Acceptable[Int]
    implicit object LongOk extends Acceptable[Long]
  }

  // Our method:
  def f[T: Acceptable](t: T) = ...

  import Acceptable._
  // now we can use f only with argument types which are in the Acceptable context.
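A complete, runnable version of this sketch - note that because the implicit objects live in Acceptable's companion object, they are found even without the import:

```scala
abstract class Acceptable[T]
object Acceptable {
  implicit object IntOk  extends Acceptable[Int]
  implicit object LongOk extends Acceptable[Long]
}

object Guard {
  // the body just returns its argument; the point is the constraint on T
  def f[T: Acceptable](t: T): T = t
}

// Guard.f(1)   - ok, Acceptable.IntOk is found in the companion object
// Guard.f(1L)  - ok, Acceptable.LongOk
// Guard.f("s") - compile error: no implicit Acceptable[String]
```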
					

Other interesting implementation using type-classes

Interesting typeclasses:
  • Numeric

Look at the scala.math.Integral implementation. This class extends Numeric with two methods, and has an interesting inner class which brings the feel of instance methods instead of calls on the implicit object - eg:

  val x = some_val_with_Integral_support
  val y = some_val_with_Integral_support
  x / y                                     // calls Integral[Int].quot(x, y), thanks to the internal implicit conversion to IntegralOps
  implicitly[ Integral[Int] ].quot(x, y)    // normal call
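Integral is handy for writing functions generic over all integral types. A small sketch, using scala.math.Integral.Implicits (available since Scala 2.13) to get the operator syntax:

```scala
import scala.math.Integral.Implicits._   // brings x / y (quot) and x % y (rem) for any T: Integral

object IntegralDemo {
  // works for Int, Long, BigInt, ... - anything with an Integral instance
  def divMod[T: Integral](x: T, y: T): (T, T) = (x / y, x % y)
}
```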
			  

Generic API

The following example shows generic API design, presenting the power that we get with typeclasses.
Here a generic API implementation means:
  • open - allows multiple implementations; the user can choose which one to use (by importing a particular implicit object into the current scope)
  • works with abstractions that the user implements later
  • divides the code's functionality into simple, small implicit objects
  • allows dispatching a function (choosing which implicit object to use) based on the return type and the function arguments, as well as on the current context
  • all dispatch is static - in the end the compiler inserts the right static code (nothing is resolved dynamically at runtime)
This contrasts with what we could achieve with subtyping, where all subtypes are bound, at declaration time, by the contract that the supertype exposes. This means subtyping is closed, in contrast to open typeclasses.

The example implements a Scala version of the Haskell Read typeclass, which has one method: read :: (Read a) => String -> a - the function requires a type a which implements the class Read. It takes the string representation of a value and returns the instance of type a corresponding to that string.


  // typeclass
  trait Read[T] {
    def read(s: String): T
  }

  // implementing instances for Read
  implicit object IntRead extends Read[Int] {
    def read(s: String) = s.toInt
  }


  // Our Object
  case class Name(first: String, last: String)

  object NameDescription {
    def unapply(s: String): Option[(String, String)] = {
      val a = s.split("/")
      if (a.length == 2) Some((a(0), a(1))) else None
    }
  }

  implicit object NameRead extends Read[Name] {
    import NameDescription._
    def read(s: String) = s match {
      case NameDescription(l, f) => Name(l, f)
      case _  =>  sys.error("invalid")
    }
  }

  // we can also set up a generic context for higher level constructs:
  // here we define an implicit "instance generator" in the Read context
  // the compiler can automatically make a new instance of Read for the generic Seq[T] type,
  //      but only if the type parameter T implements the Read typeclass
  implicit def SeqRead[T : Read] = new Read[Seq[T]] {
    def read(s: String) =
      s.split(" ").toSeq map (implicitly[Read[T]] read _)
  }

  // using
  def foo[T: Read](s: String) = implicitly[Read[T]] read s

  foo[Int]("123")                // returns 123 : Int
  foo[Name]("Robert/Zaremba")    // returns Name("Robert", "Zaremba")
  foo[Seq[Int]]("1 2 3")         // returns Seq(1, 2, 3), via SeqRead[Int]
				
The presented API becomes hugely expressive under the control of the static type system. All these constraints are checked at compile time.
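The pieces above, gathered into one self-contained, runnable object (a sketch; the Name/unapply part is omitted for brevity):

```scala
object ReadDemo {
  trait Read[T] { def read(s: String): T }

  implicit object IntRead extends Read[Int] {
    def read(s: String) = s.toInt
  }

  // instance generator: Read[Seq[T]] exists whenever Read[T] does
  implicit def seqRead[T: Read]: Read[Seq[T]] = new Read[Seq[T]] {
    def read(s: String) = s.split(" ").toSeq.map(implicitly[Read[T]].read(_))
  }

  def foo[T: Read](s: String): T = implicitly[Read[T]].read(s)
}
```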

Polymorphism

As we have seen, Scala has a sophisticated type system with very clever type mixins, implicit type conversions and type bounds.
Besides this, Scala offers powerful type inference which allows omitting type declarations in common places.

All of this lets Scala achieve polymorphism in multiple ways, depending on the needs.
Moreover, all of them are orthogonal semantic concepts, which makes Scala an easy and powerful tool for domain models. In Debasish's words: Great languages are those that offer orthogonality in design. Stated simply, it means that the language core offers a minimal set of non-overlapping ways to compose abstractions.

Polymorphism in Scala is achieved with:
Inheritance with abstract classes
We make an abstract base class and a family of classes which inherit from it.
more. The practical application is to construct a family of classes which resolve to different constructions (Animal <- Fish, AnyRef <- String).
Inheritance with case classes
more. Useful for flat class families designed for pattern matching.
Inheritance with trait mixins
more more. Traits serve to aggregate common functionality and to make polymorphic functions which require this functionality.
So if a function expects an argument which is a Seq and has an apply method, we simply write the type as
Seq with Function1
If every object of some class needs some library functionality, we just mix in the appropriate trait at the class level.
Furthermore, traits help divide logic between:
  • the base class - there should go only the primary concept of the class (like changing variables, processing dependencies between related objects...)
  • traits - they should contain features, presentation scenarios, etc...
So every concern related to a specific class should be provided by a trait.
Traits with an abstract base class
more. They serve to build mixin functionality for a specified class family. Traits with an abstract base class use class members to build functionality and express dependencies.
Structural subtypes
more. They serve to extend a particular class (usually an abstract one) on the fly (for example by specifying a type variable) - look at the reverse method implementation in the Ordering trait.
Another use case is to specify a function argument type for which we require a set of traits to be implemented. Then we simply write the argument type as: T1 with T2 ...

Usually we use type fields when mixing traits or simply structural subtypes.

Implicit conversions - view bounds
more. Scala allows implicitly converting one type to another (eg: we have class X, but the design decision was to make it simple and small - so it doesn't mix in Ordered, but has an implicit conversion to an Ordered type).
A view bound asserts the existence of evidence for such a conversion in the current scope (via implicit functions). The evidence function is then used to make some object usable in the current operation. View bounds are used to enrich types (eg: StringOps, ArrayOps, RichInt...).
Implicit objects - typeclasses
more. They are used to make some functionality more flexible (eg: to ensure that there is some order on a specified type, while in a different context the order on the same set of objects can be different).

Manifest and type parameters instantiating

Due to type erasure we lose the type parameter of generics during class parametrization.
Inside a generic class C[T] there is no way to directly instantiate the type parameter at runtime using new. To instantiate a type parameter inside a class we can use one of the techniques below, or directly pass an already constructed object as an argument to the constructor / method.
Using a factory method
The factory method is a design pattern which takes object construction and instantiation away from us. We simply call the factory method, and the method itself knows how to instantiate the object and which class to use (eg when using inheritance it sometimes might create a subclass object). More about the factory method design pattern on the Wikipedia pages.
You should also consider the dependency injection pattern.

  // the ClassManifest context bound is needed to create the Array[T] below
  class BalanceActor[T <: Actor : ClassManifest](val fac: () => T) extends Actor {
    val workers: Int = 10

    private lazy val actors = new Array[T](workers)

    override def start() = {
      for (i <- 0 until workers) {
        actors(i) = fac() //use the factory method to instantiate a T
        actors(i).start
      }
      super.start()
    }
  }

  // using BalanceActor:
  val ba = new BalanceActor[CalcActor]( { () => new CalcActor } )
  ba.start
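In current Scala the `new Array[T](workers)` line needs runtime class information; a ClassTag context bound (scala.reflect.ClassTag, the successor of ClassManifest since Scala 2.10) carries it. A minimal sketch, independent of the old actors library:

```scala
import scala.reflect.ClassTag

object ArrayFactory {
  // `new Array[T]` needs the runtime class of T, supplied by the ClassTag context bound
  def filledArray[T: ClassTag](n: Int, fac: () => T): Array[T] = {
    val a = new Array[T](n)
    for (i <- 0 until n) a(i) = fac()
    a
  }
}
```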
			

Parallel computation

Snippets on parallel computation

TODO

Tips and popular structures

Interesting patterns

Really great presentation: PODCAST DESIGN PATTERNS IN SCALA

Dependency Injection

Dependency Injection has lately become a very popular pattern. It is usually used to easily change some functionality depending on the environment it is used in.
Dependencies are components which contain some functionality. We can create several versions of a component and easily substitute them in other associated components.
You can read more about DI at http://tutorials.jenkov.com/dependency-injection/index.html or on Wikipedia.

Scala is such a flexible language that you can achieve Dependency Injection in multiple ways: using mixins, the Cake Pattern and higher-order functions.

Functional Programming gurus also claim that in the FP world there is no need for a special DI framework, as higher-order functions are enough.

A very good explanation of Scala's flexibility is Martin Odersky's paper: Scalable Component Abstractions.

Dependency Injection and Factory pattern
DI looks very similar to the Factory pattern, but software gurus actually aim for DI. The key difference is that:
  • using a factory, your code is still responsible for object creation
  • with DI you outsource that responsibility to a framework / other module which is separate from your code.
The drawbacks of the Factory pattern are described at http://tutorials.jenkov.com/dependency-injection/dependency-injection-replacing-factory-patterns.html

Cake Pattern

The Cake Pattern is a very clean way to do Dependency Injection which uses Scala's flexibility.

The key idea is that we wrap every class/functionality in an abstract component which has the instance of the functionality as a field (a val, or a parameterless method which gives us the instance). This field is used by other components which depend on it, and we should delay coupling to any initialization of it until the time we absolutely need it - which is when we assemble the application from the components.

When some component (A) depends on another component (B), we use a self type annotation (self: B =>) to express this. So if we want to use component A, we need to mix in component B. In the end the functionality in class A has access to the instance of functionality B, which is specified at creation time.
To get the full functionality, we assemble the components using mixins and construct an object or instance of it. It is similar to putting together different layers of a cake to form the final shape.

If the construction of the object depends on the initialization order, then the object holding the functionality should be a lazy val. Eg: component A depends on component B, and initialization of A requires some information from initialization of B.

Since every trait can be mixed in only once, there is a limitation on using mixins. This is described in the Constructor based DI section.

General template:

  // here we can make an trait to assure that every implementation of component would look that same:
  trait Functionality1 {
    val / lazy val / def functionality1 : Functionality1Impl
    abstract class / trait Functionality1Impl { ... }
  }

  trait Functionality1Component_V1 extends Functionality1 {
    self : Dependency1 with Dependency2 =>              // eg self: Functionality2 =>
    class Functionality1Impl { .. }
  }

  trait Functionality1Component_V2  extends Functionality1 {
    self : DependencyComponent1 with DependencyComponent2 =>
    val _cached_functionality1 : Functionality1Impl     // still we delay with the instantiation
    val / lazy val / def functionality1  = _cached_functionality1
    class Functionality1Impl { .. }                     // other implementation
  } 
def or val to express the dependency instance?
If you use a simple val, all implementations are locked in and have to provide a single dependency instance (a constant). With a method, you can return different values on each invocation. For example, in a web environment this is a great way to implement scoping! The method can read from the request or session state.
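A minimal runnable sketch of the difference (names hypothetical):

```scala
trait Repo { def find(name: String): Option[String] }
class DbRepo extends Repo { def find(name: String) = Some(name) }

trait RepoComponent { def repo: Repo }   // `def` leaves the instantiation policy open

object FreshComponent extends RepoComponent {
  def repo = new DbRepo                  // a fresh instance on every access
}
object CachedComponent extends RepoComponent {
  private val cached = new DbRepo
  def repo = cached                      // one constant instance, behaving like a val
}
```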
End components can still have abstract vals, and define them when instantiating:
The whole example:

  class User(val name: String) {
    override def toString() = name
    val username = name;
  }

  /****************************
   * Functionality 1: retrieving users
   * here functionality is not surrounded by the class
   */
  trait UserRepo{
    val repo_address :String
    println("> creating UserRepo instance to \""+repo_address+"\" repository")    // this will be printed each time...
                                      //    ...when UserRepoMock is called from "def instance"
    def find(name: String): Option[User]
  }

  //***   implementations of UserRepo   ***
  class UserRepo1(val repo_address: String) extends UserRepo{
    def find (name: String) =
      if(name startsWith "r") Some(new User(name))
      else None
  }

  class UserRepoMock(val repo_address: String) extends UserRepo{
    def find (name: String) = {
      println("mock find")
      Some(new User(name))
    }
  }

  //****   wrapping into component   ****
  trait UserRepoComponent[T <: UserRepo] {
    def userRepo : T                        // here val or lazy val can be used as well
  }

  //***   examples about how to use User Repo Component   ***
  object UserService extends UserRepoComponent[UserRepo1] {
    def userRepo = new UserRepo1("main")   // because of def, a new instance of UserRepo1 is created
  }                                        //    with every access to the userRepo field

  //***   or as the val   ***
  object StartUserRepo {
    val test_env = new UserRepoComponent[UserRepoMock] {
      val _defaultUserRepo = new UserRepoMock("mock_repo")
      def userRepo = _defaultUserRepo      // here we return the cached instance of UserRepo, to avoid creating new instances on each call;
    }                                      //    thanks to using def we could still add some work (eg logging) on userRepo access
  }


  /****************************
   * Functionality 2: User authorization
   * usually we put the functionality inside an abstract component
   * this puts the implementation in a coherent namespace:
   */
  trait LoggerComponent {
    val logger : Logger                     // here we are using val
    trait Logger {
      def log(ms: String)
    }
  }

  trait LoggerComponentStd extends LoggerComponent {
    val logger = new LoggerImpl            // val is specified at trait definition, so we can't simply change it at runtime -
                                           //     we can only override it at creation time.
    class LoggerImpl extends Logger {
      def log(ms:String) = println(ms)
    }
  }

  trait LoggerComponentFile extends LoggerComponent{
    //     we postpone the creation of the val logger until the whole application object is created
    class LoggerImpl extends Logger {
      def log(ms: String) = println("logging to file: \""+ms+"\"")
    }
    class LoggerMockImpl extends Logger {                        // Mock class Version!
      def log(ms: String) = println("logging to file mock: \""+ms+"\"")
    }
  }

  /****************************
   * Functionality 3: User Service which has Authorization and User Update
   * this functionality depends on other components: UserRepoComponent and LoggerComponent
   */
  trait UserServiceComponent {
    this : UserRepoComponent[_] with LoggerComponent =>   // here we express dependencies
    def authorizator : Authorizator
    val userUpdater = new UserUpdater                     // we set default instance here
    trait Authorizator {
      def authorize(usr: String, passwd: String) : Option[User]
      def change_passwd(u: User, new_pass: String)
    }
    class UserUpdater {                                   // here is the default implementation for UserUpdater
      def update(u:User) = println("User " + u.name + " updated" + u)
    }
  }

  trait UserServiceComponentStd extends UserServiceComponent {
    this : UserRepoComponent[_ <: UserRepo] with LoggerComponent =>
    val company: String       // traits can't have constructors with parameters, so an abstract field plays the role of a trait parameter
    val authorizator = new AuthorizatorStd(company)
    class AuthorizatorStd(company: String) extends Authorizator {
      def authorize(usr: String, passwd: String) = {
        logger.log("trying to authorize "+usr + " from "+ company)
        if(passwd == "ok") userRepo.find(usr)
        else None
      }
      def change_passwd(u:User, new_pass: String) = println("Password changed")
    }
  }


  /****************************
   * Assembling everything together
   */
  object StartCake extends App {
    val service = new UserServiceComponentStd with LoggerComponentFile with UserRepoComponent[UserRepo1] {
      lazy val company = "super company"           // needs to be lazy, since authorizator relies on this and is unaware of the initialization order -
                                                   // otherwise authorizator could bind company as a null value.
      val logger = new LoggerImpl                  // logger instance was postponed
      def userRepo = new UserRepo1("main")
      override val userUpdater = new UserUpdater   // we can override the default value (actually for simplicity is the same).
    }
    println(service.userRepo.find("marta"))
    service.userRepo.find("robert") match {
      case Some(u) => println("found user robert")
      case None    => println("user robert not found")
    }
    println(service.authorizator.authorize("robert", "ok"))
  }
              
Very good discussion about cake pattern: http://www.warski.org/blog/2011/04/di-in-scala-cake-pattern-pros-cons/.

Constructor based DI

Using multiple versions of some component with DI
The following link contains a clean solution for DI implemented by mixing the Cake Pattern with simple constructor / method parameters: http://stackoverflow.com/questions/5190328/can-the-cake-pattern-be-used-for-non-singleton-style-dependencies

Full example showing constructor based DI
which comes from: http://jboner.github.com/2008/10/06/real-world-scala-dependency-injection-di.html


  // UserRepo and UserRepo1 as previously

  // other service
  trait Logger {
    def log(ms: String)
  }

  class LoggerStdOut extends Logger {
    def log(ms:String) = println(ms)
  }

  // =======================
  // a service declaring the two dependencies that it wants injected,
  //     using plain constructor parameters to declare them
  class UserService(val logger: Logger, val userRepo: UserRepo) {
    def authorize(username :String) = {
      logger.log("trying to authorize "+username)
      userRepo.find(username)
    }
  }

  class Client(us: UserService ) {
    us.authorize("robert")
  }

  // =======================
  // instantiate the services in a configuration module
  object Config {
    lazy val logger = new LoggerStdOut
    lazy val userRepo = new UserRepo1("main")
    lazy val userService = new UserService(logger, userRepo) // this is where injection happens
  }

  new Client(Config.userService)  // running the client code
              

Functional Programming based DI

If we want to be functional, then all dependencies should go through function arguments. Specialization is done with currying / partial application.

  // UserRepo, UserRepo1 and Logger as previously; LoggerMockImpl is a standalone mock here

  trait UserService {
    def authorize(ur: UserRepo, logger: Logger): String => Option[User] = s => {
      logger.log("trying to authorize: " + s)
      ur.find(s)
    }
    def addUser(ur: UserRepo, logger: Logger): String => Option[User] = s => {
      logger.log("adding user: " + s)
      if (ur.find(s).isDefined) None
      else Some(new User(s))
    }
    def sayHello(logger: Logger): User => Unit = u => logger.log(u.name + " said hello")

    // some test: add a user, then greet him
    def test(ur: UserRepo): String => Unit = s =>
      addUser(ur, new LoggerMockImpl)(s) foreach sayHello(new LoggerMockImpl)

    // with scalaz's Reader monad we could also compose addUser and sayHello
    //    point-free, by treating the functions awaiting their dependencies as monadic values
  }

  // assembling through partial application
  object UserService1 extends UserService {
    val logger = new LoggerStdOut
    val authorize1 = authorize(new UserRepo1("main"), logger)
    val addUser1   = addUser(new UserRepo1("main"), logger)

    test(new UserRepo1("main"))("test_user")
  }

              

Reflections

A nice text about getting the class object from a class instance is here

Abstract types vs Generics

Scala: Abstract Types vs Generics

read: http://stackoverflow.com/questions/1154571/scala-abstract-types-vs-generics

Generic reified

Scala (or rather the JVM) has a problem with extending a class by the same trait multiple times but with different parametrization, eg:

  trait Y[T]
  class X extends Y[A] with Y[B]    // illegal - Y is inherited twice with different type parameters
			
Scala 2.10 is going to have some partial solution.
More on this: http://stackoverflow.com/questions/8605329/reified-generics-in-scala-2-10

Creating jars

creating a jar from a scala file

Omitting dot and parentheses


  "a b c".split(" ").toSeq map ("L"+)  // split returns Array[String] which doesn't have toSeq method, but WrappedArray has.
                                       // This requires implicit conversion from Array to ArrayWrapper toSeq
                                       // but the problem is with `map` function
  ("a b c".split(" ").toSeq map ("L"+)  // Other way to write this expression
		

Omitting brackets for generics

We can omit brackets when specifying a generic type, by using infix notation:

  class Event
  trait Handles[-A, -E <: Event]
  class Inventory
  class CreationEvent extends Event

  def f(arg: Inventory Handles CreationEvent) = ()   // the same as f(arg: Handles[Inventory, CreationEvent])
			

Tips for common methods

Circular type dependency

Suppose that in one module we want to create two types which depend on each other. The motivation is to enforce the symmetric dependency at compile time, using the type system.

  abstract class Container[E <: Element[_]] {
    def contains( e: E ): Boolean
    def addNewElement(): Unit
  }

  abstract class Element[C <: Container[_]] {
    def enclosingContainer: C          // parameterless, so it can be overridden by a val
  }

  class MyContainer extends Container[MyElement] {
    private var elements = List[MyElement]()
    override def contains( elem: MyElement ) = elements.contains( elem )
    override def addNewElement() { elements ::= new MyElement(this) }
  }

  class MyElement( container: MyContainer ) extends Element[MyContainer] {
    override val enclosingContainer = container
  }
		

Rank-k polymorphism

We have some generic function f: f[T]: T => T, and we want to make a function z which takes f as an argument and operates on two parametrized versions of f, eg calls f[Int] and f[Double].
The problem is that we would need to declare the type of f in z's definition.
We simply can't do it like this:

  def z[T](f: T => T) = f[Int](1) + f[Double](2.2)     // error!
		
This doesn't work, because we use f here as if it had two types (Int => Int and Double => Double).
The solution is to use a class/trait wrapping the generic function:

  trait ForAll {
    def wrapper[X](x: X): X
  }
  def z(wop: ForAll) = wop.wrapper[Int](1) + wop.wrapper[Double](2.2)

  // using:
  def f[T](x: T) = x
  z(new ForAll { def wrapper[X](x: X) = f(x) })
		

Union types

A similar construction, not based on type operations, can be found in the narrowing argument types section. Motivation: we want a construction to express that a type is either T1 or T2 - we call it a Union Type.

This can also be useful for method overloading with generic types.

Using generic class with two fields


  case class OrType[A,B](val a: Option[A], val b: Option[B])

  object OrType {
    type or[A,B] = OrType[A,B]                                   // to type "or" instead of "OrType"

    private def da[A,B](a: A): or[A,B] = { OrType(Some(a),None) }
    private def db[A,B](b: B): or[A,B] = { OrType(None,Some(b)) }

    // implicit defs - stuttering-or
    implicit def aToOrType2[A,B](a: A): or[A,B] =
      { da(a) }
    implicit def bToOrType2[A,B](b: B): or[A,B] =
      { db(b) }
    implicit def aToOrType3[A,B,C](a: A): or[or[A,B],C] =
      { da(da(a)) }
    implicit def bToOrType3[A,B,C](b: B): or[or[A,B],C] =
      { da(db(b)) }
  }

  // using:
  import OrType._
  class Foo {
    def erasureMethod[T <% String or Int](lt: List[T]) = {
      for (x <- lt) x match {
        case x: String => println("String list item: " + x)
        case x: Int => println("Int list item: " + x)
      }
    }
  }
				

The drawback of this solution is that we introduce a new class type with two "subtypes" (two fields).
Below is a better solution using more sophisticated type system constructions.

Using logic construction

We construct the Union Type using the Curry-Howard isomorphism - transforming types into logic. The built-in logic operator on types is inheritance - using the with or extends keywords.

  type ¬[A] = A => Nothing
  type v[T, U] = ¬[¬[T] with ¬[U]]                       // DeMorgan law
  type ¬¬[A] = ¬[¬[A]]
  type |v|[T, U] = { type λ[X] = ¬¬[X] <:< (T v U) }

  // Using
  def size[T: (Int |v| String)#λ](t: T) = t match {
    case i: Int => i
    case s: String => s.length
  }

  size(3)                        // returns 3: Int
  size("hej there")              // returns 9: Int
  size(4.2)                      // error: Cannot prove that ((Double) => Nothing) => Nothing >: Nothing with (java.lang.String) => Nothing) => Nothing.
			
Why the additional |v|? Because implicitly[Int <:< (Int v String)] (asking the compiler to prove that Int is a subtype of Int v String) simply does not hold: the left-hand side of <:< is Int, while the right-hand side is a function type (because ¬ is a function type). We need to wrap the left-hand side of <:< so that both sides are function types - that is why we have the ¬¬ and |v| types.

More elaboration on the union type construction can be found on Miles Sabin's blog

How far can we go with type expressions

In the propositional calculus it is possible to express all terms using negation and disjunction.
With the definitions above, this also becomes possible in Scala.

Since type level calculations in Scala are Turing complete it should be possible to find type construction corresponding to any recursive function. This means that – in theory at least – Scala's type system is powerful enough to express any type whose set of values is recursive.

To find the construction for any recursive function we can generalize our |v| type constructor to the concept of an Acceptor:


  type Acceptor[T, U] = { type λ[X] = ... }
				
and for any recursive function try to construct the corresponding type-level Acceptor.

Unboxed Tagged Types

Motivation: we have some basic type that can represent several concepts, eg: an Int can be the number of seconds since the EPOCH as well as the number of seconds since the beginning of the day.
To avoid mistakes in the interpretation of the Int we want the type system to control the interpretation of our values.

We can attach an interpretation to a value by boxing it into a higher type or by refining its type

Using boxing

The classic approach is to box the Int into two classes which represent day seconds and epoch seconds. This introduces new types into the class hierarchy and costs extra memory.
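The boxing approach might be sketched like this (the wrapper names are hypothetical):

```scala
// One wrapper class per interpretation - each value pays for an extra object
case class EpochSeconds(value: Long)
case class DaySeconds(value: Long)

def calculateDay(e: EpochSeconds): Long = e.value / 86400

val day = calculateDay(EpochSeconds(3 * 86400L + 100))   // OK
// calculateDay(DaySeconds(3600))   // compile error: type mismatch
```

The type system now rejects the wrong interpretation, at the price of allocating a wrapper for every value.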

Tagged types

This approach uses the type system without creating extra classes.

Quite good code snippets about unboxed types: https://gist.github.com/89c9b47a91017973a35f.

Unboxed Tagged Types are part of Scalaz 7.

  type Tagged[U] = { type Tag = U }     // type refinement using type alias
  type @@[T, U] = T with Tagged[U]      // type constructor

  trait Day
  trait Epoch

  type Epochtime = Long @@ Epoch        // type aliases for Long type with tag refinement
  type Daytime   = Long @@ Day

  // conversion functions:
  def daytime(i: Long): Daytime     = i.asInstanceOf[Daytime]
  def epochtime(i: Long): Epochtime = i.asInstanceOf[Epochtime]

  // we can use the pimp my library pattern to add extra functionality:
  import java.text.SimpleDateFormat
  import java.util.Date

  val hhmmFormat = new SimpleDateFormat("hh:mm")
  case class EpochtimeDisplay(time: Epochtime) {
    // here new Date expects a Long, but this is ok because Epochtime *is* a Long
    def hhmm = hhmmFormat.format(new Date(time))
  }
  implicit def toEpochtimeDisplay(t: Epochtime) = new EpochtimeDisplay(t)

  // using:
  def calculateDay(e: Epochtime) = ...
  val e = epochtime(10231231)
  val d = daytime(2231)
  e.hhmm
  calculateDay(e)                // OK
  calculateDay(d)                // Error
			

Popular data structures

Arrays

new Array[Int](size), Array("mama", "tata") - the latter is a call to the apply method of the companion object

Indexing is done through an apply method call, eg: tab(i)
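Both forms, and the apply/update desugaring, in a short sketch:

```scala
val a = new Array[Int](3)        // zero-initialized array of length 3
val b = Array("mama", "tata")    // Array.apply from the companion object
val first = b(0)                 // indexing desugars to b.apply(0)
a(0) = 7                         // assignment desugars to a.update(0, 7)
```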

Lists

Scala has several implementations of lists. List is covariant, and the empty list has type List[Nothing]; Nothing is the bottom type in Scala's class hierarchy, so for any type A, List[Nothing] is a subtype of List[A].
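This covariance is easy to check in a few lines:

```scala
val empty: List[Nothing] = Nil        // Nil is typed List[Nothing]
val strings: List[String] = empty     // OK thanks to covariance
val ints: List[Int] = empty           // the same value fits any List[A]
val widened = "a" :: empty            // prepending infers List[String]
```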

Tuples

var x = (1, "22", 'a'). Access: x._1. It is impossible to write a single apply method returning a particular field, because such a method would need a different return type depending on which field it returns.
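That is why each position has its own accessor with its own static type:

```scala
val x = (1, "22", 'a')
val n: Int = x._1       // each accessor has a precise static type
val s: String = x._2
val c: Char = x._3
```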

zipped

Tuple2.zipped is a method which takes a tuple of collections and returns a Zipped instance implementing variants of map, flatMap and foreach that operate pairwise on both elements of the tuple (instead of on a single list).
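For example (Scala 2 syntax; `"a" * 2` repeats a string):

```scala
val t = (List(1, 2, 3), List("a", "b", "c"))
val r = t.zipped.map((n, s) => s * n)   // maps over both lists pairwise
```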

Map, Set

scala.collection.(mutable | immutable).Map, eg: var x = Map(1->"one", 2->"two")

scala.collection.(mutable | immutable).Set, eg: var x = Set("one", "two")

The default types imported with Predef are the immutable ones.
For objects of an immutable type a var does not make much sense (var x = scala.collection.immutable.Set)
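The immutable/mutable difference in a nutshell:

```scala
import scala.collection.mutable

val im = Map(1 -> "one")        // Predef's Map is the immutable one
val im2 = im + (2 -> "two")     // + builds a new map; im is untouched

val mm = mutable.Map(1 -> "one")
mm += (2 -> "two")              // the mutable variant updates in place
```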

Tips for web development

Web servers



Nice libraries

Scalaz

Scalaz brings to Scala some generic functions and abstractions that are not present in the current Scala API.
Typeclasses are the cornerstone of the Scalaz distribution. Instead of thinking polymorphically in inheritance hierarchies, think in terms of designing APIs for the open world using typeclasses. Scalaz implements the Haskell hierarchy of typeclasses - Functor, Pointed, Applicative, Monad - and the associated operations that come with them.

Introduction to basic algebra structures

An algebraic structure A=(G, O) is a set of objects G together with operators O (functions) on these objects. The operators are closed over G (every application of an operator yields an element of G)
Semigroup
An algebra which has an associative binary operation.
Monoid
A semigroup which has a neutral element e in G for the semigroup's binary operator.
Group
A monoid which has an inverse operator Inv for every element of G, such that Inv(g)+g=e, where e is the neutral element and + is the binary operator given by the semigroup.
Functor
An algebra supporting "simple kind homomorphism" operations.
In functional programming this means higher-order structures (containers) that support a map operator, which applies a function to every element of the structure.
Monads
Higher-order structures which are Functors and have a "flatMap"-like method
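The first two structures can be sketched as typeclasses in a few lines (the names below are illustrative, not Scalaz's exact API):

```scala
trait Semigroup[A] { def append(x: A, y: A): A }
trait Monoid[A] extends Semigroup[A] { def zero: A }

// (Int, +, 0) is a monoid
implicit val intAddition: Monoid[Int] = new Monoid[Int] {
  val zero = 0
  def append(x: Int, y: Int) = x + y
}

// any Monoid induces a generic fold over lists
def concatAll[A](xs: List[A])(implicit m: Monoid[A]): A =
  xs.foldLeft(m.zero)(m.append)
```

concatAll works for every type with a Monoid instance - the empty list safely returns the neutral element.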

The principals behind Scalaz

Scalaz introduces 3 main kinds (interpretations of "normal" types): Identity[_], MA[_,_], MAB[_,_,_]:
  • A ~> Identity[A] which means that any non-generic type a is a kind of Identity[_] - there exists implicit conversion from A to Identity[A] (like Int, String, ...)
  • M[A] ~> MA[M,A] - any generic type with one type parameter is a kind of MA[_,_] - there exists implicit conversion from M[A] to MA[M, A] (like Set, List, Option...)
  • M[A, B] ~> MAB[M,A, B] - any generic type with two type parameters is a kind of MAB[_,_,_] - there exists implicit conversion from M[A,B] to MAB[M, A, B]
Monoid
Identity, MA and MAB kinds can be Monoids.
The examples below show some monoids (type, operation, neutral element of the operation):
  • (Int, +, 0); (Int, *, 1)
  • (Boolean, + == `or`, false); (Boolean, `and`, true) - Scalaz introduces BooleanConjunction for the latter, for which + defaults to `and`
  • (Function2, + == `andThen`, id);
  • (Option, + == `or`, None);
Scalaz kinds are monoids if:
  • Identity[A] if A is Monoid
  • MA[M, A] if A is Monoid
  • MAB[M, A, B] if B is Monoid
So for example Function1[A,B] is a monoid if the return type (B) is a monoid.
I will present some useful operators on monoids which are defined by Scalaz:
|+|
Scalaz defines |+| as the default monoid operation (eg: `or` for Boolean)

With monoids we can perform very useful operations. For example:


  trait TradingPosition {
    def sym: Ticker
    def qty: Int
  }

  val f_london = (_ : TradingPosition).sym.id endsWith ".L"
  val f_ny = (_ : TradingPosition).sym.id endsWith ".O"

  val positions: Seq[TradingPosition] = get_positions_from_db("mydb")
  positions filter (f_london |+| f_ny)                  // returns positions from london or ny
					
~
~ is the "zero" operation: applied to an Option[A] where A is a Monoid, it returns the wrapped value or the monoid's zero

  var pos_map: Map[Ticker, Int] = get_positions...     // another view of the positions: a map from ticker to quantity

  // with a new trade we want to increase the quantity in pos_map
  def newTrade(trd : Trade): Unit =
    pos_map += (trd.sym -> ((pos_map.get(trd.sym) getOrElse 0) + trd.qty))
    // the previous statement can be simplified to:
    pos_map += (trd.sym -> (~pos_map.get(trd.sym) |+| trd.qty))  // this has another advantage: the Int type doesn't appear here
                                                                // so we can safely change representation from Int to pair of Int's
					
Function wrappers
Scalaz defines function wrappers for the Function0, Function1 and Function2 structures: Function0W, Function1W, Function2W. Function1W and Function2W define some interesting methods:
Function1W, Function2W .lift
Lifts the function into a higher kind (Monad-like), so that we can apply a function A=>B to a collection (Functor) of A:
f.lift[Container_type] apply container means container map f

  val g = (_:Int)+1
  g.lift[List] apply List(2,3) assert_=== List(3,4)
  g.lift[List].second apply (1, List(2,3)) assert_=== (1, List(3,4))  // see second definition in Array type below

  val f = (a:Int) => (a, List(a+1,a+2))
  f andThen g.lift[List].second apply 1 assert_=== (1, List(3,4))
                                                     // here we compose f and g to second element of f result
  f(1):-> g.lift[List] assert_=== (1, List(3,4))     // the same effect as above, see :-> definition below
					
Arrow
Arrow is a wrapper around a function. Arrow exposes two important functions, first and second, which expect a pair as an argument and apply the function to the first or second element of the pair. There is an implicit conversion from Function1 to Arrow.
Arrow.first
((_:Int) + 1).first apply (7, "abc") assert_=== (8, "abc")
Arrow.second
((_:String) + 1).second apply (7, "abc") assert_=== (7, "abc1")
MAB kind
MAB representatives are the Function1[_,_] type, Pair and Either
MAB.>>>
((_: Int)+ 2) >>> (_*3) apply 2 assert_=== 12
MAB.<<<
((_: Int)+ 2) <<< ((_:Int)*3) apply 2 assert_=== 8
MAB.&&&
((_: Int)+ 2) &&& (_*3) apply 2 == (4,6)
MAB.***
((_: List[Int]):+ 3) *** ((_:Int) + 10) apply (List(1,2), 7) == (List(1,2,3), 17)
MAB.product
((_:Int) + 1).product apply (9, 99) assert_=== (10, 100)
MAB.:->
This operates only on BiFunctor types (Functor ~ container of exactly two elements, like Pair, Either...)
(1,2) :-> (_ * 2) assert_=== (1, 4)
((_:Int) * 2) <-: (1,2) assert_=== (2, 2)
(Left(2): Either[Int, Int]):->(_*2) assert_=== Left(2)
((_:Int)*2) <-: (Left(2): Either[Int, Int]) assert_=== Left(4)
(Right(2): Either[Int, Int]):->(_*2) assert_=== Right(4)

Strings in Scalaz

Scala usually operates on List[Char] instead of String. To make use of String in Scalaz, instead of converting List[Char] to String we need to explicitly pass some implicit objects to the method call.
Some explanation is at http://stackoverflow.com/questions/7631844/string-seen-as-a-monoid

Nice articles and tutorials about Scalaz:

Sing

sing is a type-level metaprogramming library for Scala, based upon singleton type system emulation. "sing" stands for "singleton" and for "compile-time and runtime methods sing in chorus":

Processing xml with Scala

A good introduction is here.
Mark Feeney gives a quick intro to reading xml files in his article.
Overview of the library on Code Commit

Nice Scala content

Tools

My number 1 development configuration is Emacs + Ensime + scamacs (a package with a preconfigured build of Emacs + ECB) + JRebel. The build tool is SBT, which is handled by Ensime. But usually I use SBT in a separate xterm session - a separate window gives cleaner visibility into the project.

For those who prefer simplicity without special emacs/vim abilities I would recommend IntelliJ + SBT + sbt-idea (a plugin for sbt) + the sbt plugin for IntelliJ. I prefer the IntelliJ configuration over Eclipse because of its nice environment configuration. It doesn't depend on an internal Scala environment, so I can use my general Scala environment or - even better - easily manage sbt as the build tool. All dependencies and the build process are then managed by SBT, while coding and interface development are managed by IntelliJ.
If you want to use IntelliJ you can track the IntelliJ Scala plugin blog, which publishes nice update information. When I tried to configure Eclipse it was like a war with hell - I couldn't figure out how to change my Scala version or easily make a run/build configuration (eg: to use sbt for building, and specify a run configuration for the output of sbt).

FSC

The Scala compilation process usually takes a long time. The reasons are: JVM startup, Scala compiler warmup, and the time to load and JIT the Scala libraries.

To speed up the compilation process we can use fsc, which is bundled with standard Scala. On the first run it starts the standard Scala compiler, performs the whole warmup and loading, and stays detached as a daemon process in the background. Subsequent fsc calls then reuse the same compiler instance without paying the whole startup cost again.

fsc is usually supported by newer IDEs (IntelliJ), which use it to reuse the same compiler instance.
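A typical session looks like this (a CLI fragment, assuming a standard Scala installation; output omitted):

```shell
fsc Hello.scala      # first call starts the resident compiler daemon (slow)
fsc Hello.scala      # later calls reuse the warm instance (fast)
fsc -shutdown        # stop the daemon when you are done
```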

SBT

Simple Build Tool - a must-have and must-learn for the Scala developer.
Simple for building, simple for managing versions/dependencies, simple for running (servers, complicated environment configurations). Lots of plugins, and it's easy to write your own... To start using SBT check the getting started page.

SBT only recompiles sources that are out of date.
Unfortunately (as of version 0.11) sbt doesn't use fsc to speed up compilation, but it does use the same JVM for each compilation. So it avoids the JVM startup overhead, but still needs time for Scala compiler warmup and library JIT.

As of the 0.10.x and 0.11.x versions SBT doesn't have a built-in task to set up a new project. To start SBT just go to your project directory, run sbt and type the task to perform.
The most basic tasks are

  • compile looks for the source files and compiles them; sbt compiles only those with new changes.
  • run looks for a class with a main function and runs it
As stated in the getting started pages, SBT by default checks the ./; src/main/scala; src/main/java subdirectories to find source files (eg: for the compile task) and uses target/<scala-version>/... for .class files. All of this, and more (setting the project version, scala-compiler version, compiler options, dependencies, classpath, owner/organization, copyright, packaging options ...) can be specified in build.sbt and project/*.scala or *.sbt files. Further specification is on the SBT wiki.
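As an illustration, a minimal build.sbt for such a layout might look like this (all values are hypothetical; scalaz-core is shown only as an example dependency):

```scala
// build.sbt - one setting per expression, separated by blank lines (pre-0.13 syntax)
name := "my-project"

version := "0.1.0"

scalaVersion := "2.9.2"

// dependencies are fetched and cached by sbt automatically
libraryDependencies += "org.scalaz" %% "scalaz-core" % "6.0.4"
```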

SBT performs dependency management as well as automated downloading of missing dependencies.

SBT performs excellently as a build tool and for running Scala applications and scripts! With SBT you don't need any Scala distribution - just SBT. It will download and manage everything else you need.

SBT plugins
They are used to add new tasks and functionality to sbt. I will mention only a couple of interesting plugins:
np (New Project)
SBT lacks a wizard task to create a new project. Here the np plugin comes in - a simple utility for creating new projects in sbt. When activated, run in a new folder:

  $ sbt
  $ np name:my-sub-project dir:sub-project-dir     
This will create a new sbt project source tree for a project named my-sub-project under the directory named sub-project-dir, relative to your project's base directory.
coffeescripted-sbt
CoffeeScript is a programming language which compiles to JavaScript. It is very pleasant to use.
The plugin compiles your CoffeeScript files so you don't have to.
ensime-sbt
Adds the command ensime generate to sbt, which generates an .ensime project file. The similar functionality in ensime itself is really restricted and doesn't parse sbt build files (it just detects them).
sbt-idea
This adds a task to create an IDEA project from an existing sbt project and configure it to use the libraries from the sbt cache.
Type the gen-idea [with-classifiers | no-classifiers | no-sbt-classifiers] sbt task to create IDEA project files. By default, classifiers (i.e. sources and javadocs) of sbt and library dependencies are loaded if found and references are added to the IDEA project files. If you don't want to download/reference them, use the command gen-idea no-classifiers no-sbt-classifiers.
posterous-sbt
Automatically publishes release notes to any Posterous site, based on the current version and the notes in notes/<version>.markdown.
SBT tools
SBT can be configured to act as a "different application".
For instructions on how to configure the behaviour of SBT read the SBT Launcher wiki.

Some interesting tools build on top of SBT.

giter8
is a command line tool which fetches a skeleton of some project configuration you might want to use. It simply searches a repository for a template and downloads it to your localhost.

Running g8 without arguments will output usage information. It is as simple as g8 -l to list the templates and g8 repo_name/template_name to download a template.

ls
A card catalogue for Scala libraries. Used either as an sbt-based tool or as a web application at http://ls.implicit.ly/.
I encourage you to share information about your own projects on the ls.implicit.ly site (information is on the site, in the publishing section).

Ensime

Ensime is the ENhanced Scala Interaction Mode for Emacs. A really great extension for Emacs lovers.
It has support for sbt!

The simplest .ensime configuration file to work with a single Scala file (no directory structure):


  ( :source-roots (".") )
        
There is also minimal ensime sbt project.

Below I present some tips for configuring ensime.

Enable semantic highlighting

By default Emacs uses syntactic highlighting for sources. Because a syntax highlighter cannot tell whether a symbol is a val, a var or a method call, it is recommended to enable the semantic highlighter by adding the snippet below to your .emacs file.


  (setq ensime-sem-high-faces
      '((var . (:foreground "#ff2222"))
        (val . (:foreground "#111111"))
        (varField . (:foreground "#ff6666"))
        (valField . (:foreground "#666666"))
        (class . font-lock-type-face)
        (trait . (:foreground "#084EA8"))
        (object . (:foreground "#026DF7"))
        (package . font-lock-preprocessor-face)
        (param . (:foreground "#111111"))
        (functionCall . (:foreground "#84BEE3"))))
        
Extending source roots

To link external sources (eg to extend the type inferencer or the symbol autocompleter) we need to extend the :source-roots variable. We can do it in the .ensime file or load it manually in an external script.


  ( :source-roots ("/opt/scala/proj/play/src/scala"
                   "/opt/scala/proj/scalaz/src/scala") )
      
Linking to external documentation

We load external documentation by writing an extractor function (which maps types to the appropriate files) and appending it to the ensime-doc-lookup-map list


  (defun make-play-doc-url (type &optional member)
    (ensime-make-java-doc-url-helper
      "file:///opt/scala/proj/play2/doc/api/scala/" type member))

  (add-to-list 'ensime-doc-lookup-map '("^play\\.api\\." . make-play-doc-url))
      

JRebel

JRebel is a JVM-plugin that makes it possible for Java developers to instantly see any code change made to an app without redeploying. JRebel lets you see code changes instantly, versioning classes and resources individually and updating one at a time instead of as a lump application redeploy. When developers make a change to any class or resource in their IDE, the change is immediately reflected in the deployed application, skipping the build and redeploy phases and preventing an average of 5.25 work weeks per year in redeploys. In June 2011, JRebel was recognized as "Most Innovative Java Technology" by the JAX Innovation Awards.

JRebel is an alternative to the class-updating mechanism introduced in JVM 1.4 as the hot swapping feature, which allows developers to update code on-the-fly during debugging but is limited to existing method bodies only. JRebel does not require a debugging session to be started. Instead it monitors the file system for changes and updates the classes in memory. This means that only classes compiled to ".class" files will be updated; changes to classes in JAR files will be ignored. JRebel imposes a performance overhead on the application and should not be used in production or performance tests. It is meant to be a development tool only.

To use JRebel in your sbt project add the following options to the java options in the sbt build file:


  -noverify -javaagent:/path/to/jrebel/jrebel.jar
        
Automatic reload using SBT
SBT allows you to trigger a task when files change, for example to restart the application when it detects code changes. The monitored files and directories can be configured - for example, if you are using JRebel you might want to change the monitored content.
More about triggers on the sbt wiki. On the old sbt wiki you can find some examples for web applications, eg:

  > jetty-run
  > ~ prepare-webapp
        
jetty-run starts Jetty and monitors the directories given by scanDirectories, redeploying on changes. By default the entire temporary web application directory is monitored. You might want to change scanDirectories in some cases - for example, set scanDirectories to Nil if you do not want to redeploy on any changes.
~ prepare-webapp recompiles and recreates the web application whenever source files change

Tutorials

Blogs

Scala links

other nice links