Chapter 1
Language Features

Scala borrows much of its syntax and many of its control structures from other languages. Methods are declared in Algol/C style, with additions such as optional braces and semicolons to reduce boilerplate.

Scala syntax also has features such as statements that are expressions, local type inference, suffix/infix notation, and implicit return values, which make typical idiomatic Scala code look like that of dynamic languages such as Lisp/Python/Ruby, but with the added benefit of strong static typing.

The expressive nature of Scala's syntax is useful for creating custom DSLs, with a notable example being Scala's own XML library. Scala also features several forms of inheritance (traits, classes, objects), which can be combined in various ways to structure composition in your code in a more natural and elegant way. This, combined with strong support for functional programming, syntactic sugar (for comprehensions), and type inference, makes it possible to create very terse domain-specific code that is also type safe.

This chapter provides a broad overview of the various parts of Scala's design and syntax, as well as the features that are expected in modern generic programming languages (control of scoping, string interpolation, encapsulation, and modules).

STATIC TYPES AND TYPE INFERENCE

Scala is first and foremost a statically typed language. Similar to other statically typed languages, you can annotate types for:

  • Variable declarations
  • Method/function arguments
  • Method/function return values
  • Various types of data structures

Below is a code sample to demonstrate these types of annotations:

val s: String // type definition on variable declaration
def doSomething(s: String) // type definition in a parameter
def doSomething: String // type definition for return type of a function
class SomeClass(s: String) // type definition in a class constructor definition
type MyString = String // type aliasing

A significant reason why Scala requires mandatory type signatures is that it's very difficult to provide a practical implementation of global type inference in a statically typed language that supports subtyping. Thankfully, while Scala mandates that you provide type signatures for method parameters and type definitions, for variable declarations inside function/method bodies it features local type inference. This drastically reduces the amount of boilerplate code when it comes to function/method bodies. Here is an example of local type inference in action.

def doSomething(input: String) = {
  val l = List(1,3,5)
  val summed = l.sum
  val asString = summed.toString + input
  asString.toUpperCase
}

As you can see, you don't need to specify the types of the variable declarations. To specify the types manually, do the following:

def doSomething(input: String): String = {
  val l: List[Int] = List(1,3,5)
  val summed: Int = l.sum
  val asString: String = summed.toString + input
  asString.toUpperCase
}

One bonus of specifying types for function arguments is that they also act as a form of documentation, which is particularly helpful. Return types of functions are also optional; however, annotating them is encouraged for documentation/clarity reasons. Due to Scala's generic code functionality, it is possible to have expressions that return different values in unexpected scenarios. One good example is Ints versus Doubles:

def f = 342342.43

def f_v2 = 342342

f == 342342 // returns false

f_v2 == 342342 // returns true

Since f and f_v2 don't have their return types annotated, we aren't aware of what types they can return (let's assume you don't know what the actual values of f and f_v2 are). This means that the f == 342342 expression returns something else compared to f_v2 == 342342. If, however, you know the return types of f and f_v2, you can do the following:

def f: Double = 342342.43

def f_v2: Int = 342342

As a user, you know that the return type of f is a Double, so you shouldn't compare it with an Int for equality (and since it's a floating-point number, you should compare with a delta/tolerance rather than exact equality).

Manual type signatures can also be used to force the type of a declaration. This is similar to the typecast feature in languages like Java or C; however, it's more powerful due to type conversions available in Scala. A simple example in Scala is to assign a Long type to an integer literal.

val l: Long = 45458

By default, a number literal in Scala is typed as an Int. Note that it is also possible to use a number-based suffix, like so:

val l = 45458L

Arguably, the type annotation is more idiomatic, especially for custom types that may not have literal suffixes.

Implicit Parameters, Conversions, and Their Resolution

Scala has an incredibly powerful feature called implicits. When you ask for a value implicitly, such as through a method parameter marked with the implicit keyword, you are telling the Scala compiler to look for a value of the same type in scope. This is unlike an explicit value, which you need to specify every time it's used. A practical use case for implicits is providing an API key for a REST web service. The API key is typically only provided once, so an implicit is a good fit in this instance, since you only need to define and import it once.

case class ApiKey(id: String)

Since implicits are found by their type, you need to create a new type, in this case ApiKey. Now let's define a method that makes an HTTP call.

import scala.concurrent.Future

// Assuming a User case class is defined elsewhere
def getUser(userId: Long)(implicit apiKey: ApiKey): Future[Option[User]] = ???

In order to call this function, you must make sure that an implicit ApiKey is visible in scope. Let's first make an instance of ApiKey and place it in an object.

object Config {
  implicit val apiKey: ApiKey = ApiKey(System.getenv("apiKey"))
}

Whenever you call getUser, an implicit ApiKey needs to be visible, and the code should look something like this:

import Config._ // This imports everything in Config, including implicits

object Main extends App {
  val userId = args(0).toLong
  getUser(userId)
}

It's also possible to supply an implicit argument explicitly.

object Main extends App {
  val userId = args(0).toLong
  val apiKey = ApiKey(args(1))
  getUser(userId)(apiKey)
}
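To see the whole resolution mechanism in one self-contained sketch, here is a minimal example; the Greeting type, Defaults object, and greet function are hypothetical illustrations, not part of the API-key example above:

```scala
case class Greeting(text: String)

object Defaults {
  // the implicit value the compiler will find by its type, Greeting
  implicit val greeting: Greeting = Greeting("Hello")
}

def greet(name: String)(implicit g: Greeting): String = s"${g.text}, $name"

import Defaults._
val resolved = greet("Ada")                 // compiler fills in Defaults.greeting
val explicit = greet("Ada")(Greeting("Hi")) // implicit argument supplied explicitly
```

Both call sites compile; the first asks the compiler to search the enclosing scope, while the second bypasses the search entirely.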

Implicits can be used to make automatic conversions between types; however, such usage is considered advanced and should be done sparingly, because it's easy to abuse code that uses these features. A prominent use of implicit conversions is in DSLs. For example, if you have a trivial DSL that attempts to represent SQL queries, you might have something like this:

Users.filter(_.firstName is "Bob")

In this case, _.firstName may be a type other than String (in this example, let's say it has type Column[String]), and its is method only accepts values of type Column[String]:

val columnStringImplementation = new Column[String] {
  def is(other: Column[String]): EqualsComparison = ???
}

To compare two instances of Column[String] (to see if they are equal), you want an implicit conversion from String to Column[String], so that you don't have to write:

Users.filter(_.firstName is ColumnString("Bob"))

To do this you need to define the following:

implicit def stringToColumnStringConverter(s: String): Column[String] = ???

In this case, when you import your stringToColumnStringConverter, it will automatically convert any instance of String to Column[String] if required. Since the is method only works on Column[String], the compiler checks whether there is an implicit conversion available for the String "Bob", so it can satisfy the method signature of is.

Implicit classes are a type safe solution that Scala provides for monkey patching (extending an existing class, or in Scala's case, a type without having to modify the existing source of the type). Let's try this by adding a replaceSpaceWithUnderScore method to a String.

object Implicits {
  implicit class StringExtensionMethods(string: String) {
    def replaceSpaceWithUnderScore = string.replaceAll(" ","_")
  }
}

Like the previous example, you need to import this class to make it available:

import Implicits._
"test this string".replaceSpaceWithUnderScore // returns test_this_string

Case Classes, Tuples, and Case Objects

Scala has a feature called case classes, which are a fundamental structure for representing immutable data. A case class automatically provides various helper methods: accessors, copy (which returns a new instance of the case class with modified values), as well as implementations of hashCode and equals.

case class User(id: Int, firstName: String, lastName: String)

User(1,"Bob","Elvin").copy(lastName = "Jane") // returns User(1,"Bob","Jane")

These combined features allow you to compare case classes directly. Case classes also provide a generic .toString method to pretty print the constructor values.

User(1,"Bob","Elvin").toString // returns "User(1,Bob,Elvin)"
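A short, self-contained sketch of structural equality and copy in action, reusing the User class from above:

```scala
case class User(id: Int, firstName: String, lastName: String)

val original = User(1, "Bob", "Elvin")
val renamed = original.copy(lastName = "Jane")

// equals/hashCode are generated from the constructor values,
// so two separately constructed instances with the same data compare equal
val sameData = original == User(1, "Bob", "Elvin")
val different = original == renamed
```

Here sameData is true and different is false, even though all four values are distinct object instances.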

Scala also has tuples, which work similarly to Python's tuples. A nice way to think of a tuple is as a case class whose fields are defined positionally (i.e., without field names); there is no restriction requiring all of a tuple's elements to have the same type. You can construct tuples by surrounding an expression with an extra set of parentheses:

val userData = (5,"Maria","Florence") // This will have type Tuple3[Int,String,String]

You can then convert between tuples and case classes easily, as long as the types and positions match:

val userData2: Tuple3[Int,String,String] = User.unapply(User(15,"Shelly","Sherbert")).get
val constructedUserData: User = User.tupled.apply(userData2) // Will be User(15,"Shelly","Sherbert")

Tuples are a handy way to store grouped data of multiple types, and they can be destructured using the following notation:

val (id, firstName, lastName) = userData2

Note that tuples also provide accessors of the form ._index (where index is the one-based position) to access the elements of the tuple (in the example above, ._2 would return "Shelly"). We recommend that you use the destructuring mentioned above, since it's clearer what each field is meant to be.

For internal and local use, tuples can be quite convenient, since you don't need to define something like a case class just to store data. However, if you find yourself constantly constructing/destructuring tuples, and/or if you have methods returning tuples, it's usually a good idea to use a case class instead.

Abstract Classes, Traits, and Sealed

In the previous section we discussed case classes. Apart from immutability (and other benefits), one of the primary uses of case classes is the creation of ADTs (Algebraic Data Types). Algebraic data types allow you to structure complex data in a way that is easy to decompose, and they serve a similar role to the visitor pattern in Java. Here is an example of how you can model continents and hemispheres:

sealed abstract class Hemisphere
case object North extends Hemisphere
case object South extends Hemisphere

sealed abstract class Continent(name: String, hemisphere: Hemisphere)

case object NorthAmerica extends Continent("North America", North)
case object SouthAmerica extends Continent("South America", South)
case object Europe extends Continent("Europe", North)
case object Asia extends Continent("Asia", North)
case object Africa extends Continent("Africa", South)
case object Australia extends Continent("Australia", South)

You can then easily use pattern matching (explained in greater detail in the next section) to extract data, which looks similar to this:

val continent: Continent = NorthAmerica
continent match {
  case Asia => println("Found asia")
  case _ =>
}

You may also notice the keyword “sealed.” Sealed in Scala means that it's not possible for a class/case class/object outside of the file to extend what is sealed. In the example above, if Continent was defined in a file called Continent.scala, and in Main.scala you tried to do the following, Scala produces a compile error.

case object Unknown extends Continent("Unknown", North)

The advantage of sealed is that since the compiler knows all of the possible cases of a class/trait/object being extended, it's possible to produce warnings when pattern matching against the cases of the sealed abstract class. This also applies to Traits, not just sealed abstract classes.
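As a self-contained sketch of this benefit (repeating the Hemisphere hierarchy from above, with a hypothetical describe helper):

```scala
sealed abstract class Hemisphere
case object North extends Hemisphere
case object South extends Hemisphere

// Because Hemisphere is sealed, the compiler knows all of its subtypes;
// if a case were missing here, it would emit a
// "match may not be exhaustive" warning at compile time.
def describe(h: Hemisphere): String = h match {
  case North => "northern hemisphere"
  case South => "southern hemisphere"
}
```

This exhaustiveness check is one of the main reasons ADTs in Scala are conventionally sealed.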

Traits, which allow Scala to implement the mixin pattern, are another of Scala's most powerful features. At a fundamental level, traits allow you to specify a body of code that other classes/traits/objects can extend. The only real limitation is that traits can't have a primary constructor (this avoids the diamond problem). You can think of them as interfaces, except that you can provide definitions for whichever methods you wish (and you can extend as many traits as you want).

A good use case for a trait is in a web framework, where traits are often used as a composition model for routes. Suppose you have a trait defined as follows:

trait LoginSupport { self: Controller =>
  val database: Database

  def login(userId: Long): Unit = {
    // Code dealing with logging in goes here
    afterLogin()
  }

  def logout(): Unit = {
    // Code dealing with logging out goes here
  }

  def afterLogin(): Unit = {
  }
}

Here a trait is defined that allows you to mix in login/logout functionality, but there are a few interesting points. The first is the self: Controller =>. This dictates that only a class of type Controller can extend the LoginSupport trait. Furthermore, the trait has a reference to that type through the self variable. This is actually an example of inversion of control, which is a basic technique needed for simple DI (Dependency Injection) in Scala. This form of DI is statically checked, and it doesn't depend on external libraries or features like macros.

The next important thing to note is the abstract val database: Database. It's up to the class/trait/object extending LoginSupport to provide an implementation of database, which is ideally what you want (the LoginSupport trait shouldn't need to know how the database is being instantiated).

def login and def logout provide implementations for logging in and out respectively. There is also an implementation of afterLogin(), which is called after you login() a user. Now let's assume you have a controller like:

class ApplicationController extends Controller with LoginSupport {
}

Inside this class, you now have access to the login, logout, and afterLogin methods. The above alone, however, generates a compile error, since you haven't defined database. Assuming the database is being passed into the ApplicationController (let's say implicitly), you can do this:

class ApplicationController(implicit val database: Database) extends Controller with LoginSupport {
}

You can provide an instance of the database through the constructor of ApplicationController, and it will be used by LoginSupport. Alternatively, you can do this:

class ApplicationController2 extends Controller with LoginSupport {
  lazy val database: Database = Config.getDb
}

As with extending classes in Java, you can override these methods. In this case you can execute something after the user logs in:

class ApplicationController3(implicit val database: Database) extends Controller with LoginSupport {
  override def afterLogin() = Logger.info("You have successfully logged in!")
}

As you can see, traits are incredibly flexible. They allow you to cleanly split responsibility (i.e., what is implemented by the trait versus what is needed, but not implemented).
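As a minimal, self-contained sketch of the same self-type pattern (all names here are illustrative, not from an actual web framework):

```scala
trait HasName {
  def name: String
}

// The self-type restricts Greeter to types that also extend HasName,
// giving it access to `name` without inheriting from HasName directly
trait Greeter { self: HasName =>
  def greet: String = s"Hello, $name"
}

// Person satisfies the self-type by mixing in HasName alongside Greeter
class Person(val name: String) extends HasName with Greeter
```

Greeter declares what it needs (a name) without caring how it is supplied, which is the same split of responsibility LoginSupport uses for its database.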

PATTERN MATCHING

Pattern matching is one of the most used features in the Scala language. It provides a Swiss Army knife of capabilities that are the most common ways of inspecting and dealing with object data.

Pattern matching has the ability to deal with the following:

  • Equality comparison. Similar to the switch statement, pattern matching allows you to branch out different executions depending on the value of a variable. Scala's pattern matching, however, has far better capabilities than just equality checking. It can be combined with if guards to allow for more fine-grained branching, and it's also the primary way to deconstruct ADTs (Algebraic Data Types).
  • Typesafe forced casting. Pattern matching can be used to do safe type casting. In Java, manual checks with instanceof need to be made to ensure that a safe cast is being applied. Pattern matching allows you to do this in a type safe way.
  • Destructuring assignment of case classes and types that provide unapply. Destructuring assignment, loosely speaking, is the ability to inspect the contents of a type and act on them as they are being deconstructed. This is also used to deal with Option in a type safe way and can aid the deconstruction of immutable data structures such as a List.

The beautiful thing about pattern matching is that all of the above capabilities can be combined to deconstruct complex business logic, as this example shows:

sealed abstract class SomeRepresentation
case class NumberRepr(number: Double) extends SomeRepresentation
case class StringRepr(s: String) extends SomeRepresentation

val s: SomeRepresentation = NumberRepr(998534)
s match {
  case NumberRepr(n) if n > 10 => println(s"number is greater than 10, number is $n")
  case NumberRepr(n) => println(s"number is not greater than 10, number is $n")
  case StringRepr(s) => println(s"is a string, value is $s")
}

This example demonstrates switching on an ADT, destructuring the values inside the ADT, and also matching on those destructured values. Pattern matching has the wildcard operator (_) which is the fallback if all other options are exhausted (in Java this is default). A good example of this is an actor:

import akka.actor.Actor

case class StringMessage(s: String)
case class IntMessage(i: Int)

class SomeActor extends Actor {
  def receive = {
    case StringMessage(s) => println(s"message received, it's a string of value $s")
    case IntMessage(i) => println(s"message received, it's an int of value $i")
    case _ => println("unknown message")
  }
}

Another use of pattern match is in the construction of partial functions, which is particularly powerful when combined with methods like collect:

val l: List[Any] = List(1, 2, 3, "four", 5.0) // List of values/references of any possible type
l.collect {
  case i: Int if i % 2 == 0 => i
  case s: String => s.length
} // returns List(2, 4)

In the case above, collect keeps an item from the list only if it happens to be an even number of type Int, or, if the item is of type String, returns the length of the String. Pattern matching also provides a typesafe way of dealing with Option values.

val param: Option[String] = ???
param match {
  case Some(s) => s"parameter found, its value is $s"
  case None => "no parameter is found"
}

Statements Are Expressions

In Scala, any statement is also an expression. This is particularly powerful, since it represents a consistent way of working with branching of expressions (especially when combined with pattern matching as described before).

import scala.concurrent.Future

val parameter: Option[String] = ???
val httpCall: Future[String] = parameter match {
  case Some(s) => Http.get(s"/api/doSomething/$s")
  case None => Future.successful("No parameter supplied")
}

The important thing to note here is that the result of the match expression is of type Future, so you can reuse it as a proper value later on.
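The same principle applies to simpler constructs like if: both branches produce a value, so the whole expression can be assigned directly. A small sketch:

```scala
val n = 5

// if/else is an expression: each branch yields a String,
// so the result binds to a val with no mutation or temporary variable
val parity: String = if (n % 2 == 0) "even" else "odd"
```

In a statement-oriented language you would typically declare a mutable variable and assign it inside each branch; in Scala the branch values are the result.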

String Interpolation

String interpolation provides a performant, concise, and typesafe way to embed values in the middle of string literals. The basic way is to prefix the literal with s and use $ to splice in the value of a variable:

s"the value of this variable is $s"

It's also possible to use block syntax so you can interpolate expressions, and not just values:

log.info(s"The id of the current user is ${user.id}")
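Scala also ships an f interpolator, which combines interpolation with printf-style format specifiers that are checked at compile time; a small sketch:

```scala
val name = "Ada"
val height = 1.63

// %s and %.2f are verified at compile time against the types of name and height;
// using, say, %d with a Double would be a compile error
val description = f"$name%s is $height%.2f metres tall"
```

This gives you formatted output without the runtime type errors that String.format can produce.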

Scala Collections, Immutable and Mutable

Methods like .map and .flatMap are fundamental patterns in functional programming, and they are used frequently in Scala as part of its collection framework. Scala is one of the few languages to provide a generic, strongly/statically typed collections framework that also deals with type transformations. Let's start off with one of the most fundamental types in Scala, the List.

A List in Scala represents an immutable list, whereas a Seq represents an arbitrary sequence (mutable or immutable). Below is an example of how to use a List:

val l = List(3,6,2342,8)

The type of this expression is List[Int]. One thing to note is that you can use the Seq constructor like so:

val s = Seq(3,6,2342,15)

Seq's default constructor is an immutable List, so in fact these expressions are equal. However, as stated before, a Seq can represent any sequence whereas a List is much more strict.

def listLength(l: List[_]) = l.length
def sequenceLength(s : Seq[_]) = s.length

It is perfectly legitimate to pass any sequence into sequenceLength (this can mean a Vector, a mutable list, or even a String); however, if you try to pass these types into listLength, you get a compile error. This is very powerful, as it allows you to specify the required granularity for collections, but it can also be dangerous: consuming a sequence that turns out to be mutable when you assumed it was immutable can have unintended consequences.

.map allows you to apply an anonymous function to every element in a collection, with each item being replaced by the return value of that anonymous function.

val l = List(3,6,2342,8)
l.map(int => int.toString) // Result is List("3","6","2342","8")

The example above converts every number to a String, which shows you how Scala on a simple level deals with converting types from within a collection (in this case Int -> String). One of the great things about the Scala collection library is that it provides implicit conversions that allow you to do complex transformations.

val m = Map(
  "3" -> "Bob",
  "6" -> "Alice",
  "10" -> "Fred",
  "15" -> "Yuki"
)

This Map will have the type Map[String,String]. However, let's say that you want to convert the keys to Ints, while at the same time splitting the names into their characters (i.e., Array[Char]).

A first naive solution to the problem can be:

val keys = m.keys.map{key => key.toInt}.toList
val values = m.values.map{name => name.toCharArray}
(keys zip values).toMap

As you can see, this attempt is quite wordy, and it's also inefficient. You have to manually get the keys/values out of the map and transform them. Then you have to manually zip the keys with the values and convert them to a map. A more idiomatic solution is:

m.map{case (key,name) => key.toInt -> name.toCharArray }

This version is definitely more readable and succinct than the former, but you may be wondering how this works behind the scenes. Below is the definition of map from the Scala collections library.

def map[B, That](f: A => B)(implicit bf: CanBuildFrom[Repr, B, That]): That

While this looks quite complex, the thing to note here is the implicit bf: CanBuildFrom[Repr, B, That]. This brings an implicit CanBuildFrom, which allows the Scala collection library to transform between different types to produce the desired results.

In this case, when you call .map on the Map[String,String], the argument to the anonymous function is actually a tuple (which you destructure with the case statement).

m.map{x => …} // x is of type Tuple2[String,String], with the first String being the key, and the second the value
m.map{case (key,value) => … } // Deconstructs the Tuple2[String,String] immediately; key is of type String, as is value

If you return a tuple in the anonymous function that you pass to map, CanBuildFrom will treat the Tuple2 as a key-value pair for the Map you originally operated on. The -> syntax you noticed earlier is actually an alias to construct a tuple:

implicit final class ArrowAssoc[A](private val self: A) extends AnyVal {
    @inline def -> [B](y: B): Tuple2[A, B] = Tuple2(self, y)
    def →[B](y: B): Tuple2[A, B] = ->(y)
}

In other words, key.toInt -> name.toCharArray is the same as (key.toInt, name.toCharArray). This means the return type of {case (key,name) => key.toInt -> name.toCharArray} is Tuple2[Int,Array[Char]], so all that happens is that you go from Map[String,String] to Map[Int,Array[Char]] in the final expression.

You may ask at this point, what would happen if you return a completely different type, such as if you return an Int instead of a Tuple2[A,B] (where A and B are arbitrary types). Well, it so happens that the map has its (key,value) entry replaced with just a value, so you end up converting from a Map to an Iterable.

This means that m.map{case (_,value) => value } is the same as m.values, and m.map{case (key,_) => key } is the same as m.keys. A lot of these powerful transformations are possible due to CanBuildFrom; to implement equivalent functionality, other strongly, statically typed languages typically need to build it into the compiler itself (in Scala, collections live in an ordinary library that is included by default in the Scala distribution).
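A compact sketch of both behaviors described above, on a small two-entry map:

```scala
val m = Map("3" -> "Bob", "6" -> "Alice")

// Returning a Tuple2 keeps the result a Map (here Map[Int, List[Char]])
val transformed = m.map { case (key, name) => key.toInt -> name.toList }

// Returning a single value degrades the result to an Iterable of values
val justNames = m.map { case (_, name) => name }
```

transformed stays a Map because each element maps to a key-value pair, while justNames is an Iterable[String], exactly like m.values.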

For Comprehension

When you compose maps and flatMaps, you can end up with undesirable nesting (i.e., a horizontal pyramid). The for comprehension is a very powerful feature for dealing with such issues.

import scala.util._

def firstTry: Try[String] = Success { "first response" }
def secondTry(string: String): Try[Int] = Failure {
  new IllegalArgumentException(s"Invalid length ${string.length}")
}
def finalTry(int: Int): Try[String] = Success { int.toString }


firstTry.flatMap { result =>
  val anotherResult = s"value from firstResult: $result"
  secondTry(anotherResult).flatMap { secondResult =>
    finalTry(secondResult).map { finalResult =>
      finalResult.toUpperCase
    }
  }
}

The equivalent for comprehension for this statement would be:

for {
  result <- firstTry
  anotherResult = s"value from firstResult: $result"
  secondResult <- secondTry(anotherResult)
  finalResult <- finalTry(secondResult)
} yield finalResult.toUpperCase

The for comprehension has completely flattened out the nested calls, making the intention very clear. The for comprehension also provides syntactic sugar over filter/withFilter. A good example of this is shown using the Range class:

for (i <- 1 to 10000 if i % 2 == 0 ) yield i

This provides you with all of the even numbers up to 10,000. The equivalent without the for comprehension is:

(1 to 10000).filter(i => i % 2 == 0)

(Strictly speaking, the compiler desugars the guard to withFilter, a lazy variant of filter that avoids building an intermediate collection.)
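You can verify that the two forms produce the same elements with a smaller range:

```scala
// for/yield with a guard desugars to withFilter followed by map
val viaFor = for (i <- 1 to 10 if i % 2 == 0) yield i
val viaFilter = (1 to 10).filter(i => i % 2 == 0)

// Seq equality is element-wise, so both forms compare equal
val same = viaFor == viaFilter
```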

PACKAGES, COMPANION OBJECTS, PACKAGE OBJECTS, AND SCOPING

Another fantastic feature of Scala is how it provides access control and encapsulation at every level of the language, from scoping at the package level down to scoping at the module level.

At the highest level, Scala provides packages that work the same way they do in Java. In essence, packages are a purely static construct that the compiler uses to group a collection of .scala source files. Packages can't be referenced at runtime, apart from providing them in an import, so packages are often used to organize source files at a very high level. As an example, Scala's Futures are contained within scala.concurrent, with scala.concurrent being the package.

The benefit of packages is that, because they are purely static, they can be combined across completely separate dependencies as long as there are no naming conflicts. As an example, if you want to make your own version of Future (let's call it ImprovedFuture), you can do the following:

package scala.concurrent

trait ImprovedFuture {
      // Implementation goes here
}

If you package this as a dependency and include it in one of your projects, you can import the standard Future, your ImprovedFuture, and everything else under the namespace scala.concurrent.

The other namespacing utility that Scala has is called objects. In contrast to packages, objects have an actual runtime type representation. Another common name for objects is singletons. A singleton is a class that can only ever have one global instance, which is also automatically instantiated.

object MyObject {
     // Implementation goes here
}

MyObject has a type called MyObject$ at runtime. In actual Scala code, this type can be referred to as MyObject.type. This pattern essentially provides "dynamic" modules, or in other words, allows you to work with package-like modules within your own code. For example, you can make a function that accepts MyObject and does something with it:

def doSomething(myObject: MyObject.type): Unit = ???

A more realistic example is to create a module that a function can work with. As an example, let's create a trait that defines logging:

trait LoggerImplementation {
  def publishLog(level: String, message: String): Unit
}

Now since this is just a trait, you need some way to instantiate it, so let's create a ConsoleLogger object:

object ConsoleLogger extends LoggerImplementation {
  def publishLog(level: String, message: String) = {
    println(s"level: $level, message: $message")
  }

  object Implicits {
    implicit lazy val consoleLogger: LoggerImplementation = ConsoleLogger
  }
}

This provides an interface, and you have also provided an implementation as an object. Since it is an object, you don't have to worry about instantiating ConsoleLogger, you can just import it. Now let's define a logger trait:

trait Logger {
  def log(level: String, message: String)(implicit loggerImplementation: LoggerImplementation) = {
    loggerImplementation.publishLog(level, message)
  }
}

Now let's create a basic Main object to test the logger:

import ConsoleLogger.Implicits._

object Main extends App with Logger {
  log("info", "This is a log statement")
}

In this case, ConsoleLogger represents a module (i.e., an implementation of a Logger). If you remove the ConsoleLogger.Implicits._ import, you get a compile error saying something similar to the following:

error: could not find implicit value for parameter loggerImplementation:
  this.LoggerImplementation

There is also the notion of companion objects: an object that shares its name (and source file) with a class. Companion objects work the same way normal objects do, except that a class and its companion object can access each other's private members directly.

object MyClass {
  private val statement = "This is a statement"
}

case class MyClass(additionalStatement: String) {
  def printStatement = println(s"$additionalStatement ${MyClass.statement}")
}

As with sealed traits, companion objects have to be defined in the same file that their companion class is defined in. Factory and instantiation related code is commonly placed within companion objects.
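A sketch of the factory pattern: the companion object can call a private constructor that outside code cannot. Temperature and fromFahrenheit below are hypothetical names invented for this illustration:

```scala
// The constructor is private: instances can only be created via the companion
class Temperature private (val celsius: Double)

object Temperature {
  // factory method in the companion — it is allowed to call
  // the private constructor, so all construction goes through here
  def fromFahrenheit(f: Double): Temperature = new Temperature((f - 32) / 1.8)
}

val boiling = Temperature.fromFahrenheit(212)
```

This lets the companion enforce invariants or conversions at the single point of construction, while `new Temperature(...)` remains a compile error everywhere else.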

Scala also offers another feature called package objects. Package objects are similar to the packages described earlier; however, they can contain definitions that normally can't be top level in a package (such as implicit classes, implicit conversions, or type aliases). Normally, you would have to place these constructs inside an object and then manually import them. As an example, you may have some type aliases like:

package mypackage

object InternalTypeAliases {
  type Number = BigDecimal
  type Price = BigDecimal
}

To use these types, you then have to explicitly import the InternalTypeAliases. An alternative is to use a package object, and you can define it like so.

package object mypackage {
  type Number = BigDecimal
  type Price = BigDecimal
}

Now, anything inside of mypackage automatically has access to the content defined in its package object. Because of this, package objects are commonly used to hold things shared across a package, such as utility functions, implicit conversions, and factory methods, without having to deal with massive lists of import statements.

Scala, like Java, provides various access modifiers that allow designers to restrict how code is accessed (one notable difference is that in Scala, everything is public unless specified otherwise). You may have noticed the keyword private earlier; a private member can only be accessed within the enclosing class or object (and its companion, if one is defined). The other access modifier is protected, which means the member can only be accessed within the class and any of its subclasses.

class Internals {
  private val privateInt = 1     // can only be accessed within Internals (and a
                                 // companion object Internals, if one is defined)
  protected val protectedInt = 5 // can only be accessed within Internals and
                                 // its subclasses
}

object MoreInternals extends Internals {
  val publicInt = protectedInt + 10 // is public
  // val publicInt2 = privateInt + 5 // does not compile: privateInt is private
}

object Main {
  // println((new Internals).privateInt) // does not compile
  println(MoreInternals.publicInt)
}

The use of access modifiers is important in library and API design, and they are a key principle of encapsulation. Combined with Scala's already powerful module features, it's possible to provide both extensibility and restriction as desired.

AnyVal, AnyRef, Any, and the Type Hierarchy

Unlike languages such as Java and C, Scala makes a specific distinction between values (also known as primitive types) and references generically, within the type system itself. This allows you to define your own custom value types (in a limited fashion).

An AnyVal in Scala represents a value that is not boxed; that is, it is stored as the actual value rather than as a reference. Common types that inherit from AnyVal include the numeric types (Int, Long, Double, and Float) as well as other types like Boolean. Since the memory representation of these types is very small, it's much more efficient to store the value itself in the host system, rather than a reference to the value.
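As a quick illustration, the standard numeric and Boolean types can all be used where an AnyVal is expected, while an ordinary object such as a String cannot:

```scala
val i: AnyVal = 42      // Int is a value type
val d: AnyVal = 3.14    // so is Double
val b: AnyVal = true    // and Boolean
// val s: AnyVal = "hi" // does not compile: String is an AnyRef, not an AnyVal
```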

Since Scala 2.10, you can define your own AnyVal types by extending the AnyVal class. Earlier, in the implicits section of this chapter, we defined an ApiKey class; the definition is repeated below.

case class ApiKey(id: String)

If you want to turn it into a value type, simply make it extend AnyVal:

case class ApiKey(id: String) extends AnyVal

Now, whenever an ApiKey is instantiated, you typically won't pay a performance penalty due to boxing, yet you still have the benefits of treating ApiKey as a distinct type: you can write functions that take an ApiKey instead of having to deal with a raw String.
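A sketch of that benefit (the authenticate function is hypothetical): a function that accepts an ApiKey can no longer be handed a plain String by accident, even though at runtime only the underlying String is stored:

```scala
case class ApiKey(id: String) extends AnyVal

// Hypothetical validation function; real logic would check the key properly
def authenticate(key: ApiKey): Boolean = key.id.nonEmpty

authenticate(ApiKey("secret-123"))  // compiles
// authenticate("secret-123")       // does not compile: a String is not an ApiKey
```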

There are limitations to using AnyVal (http://docs.scala-lang.org/overviews/core/value-classes.html#limitations), mainly due to the underlying host platform (in this case, the JVM).

As opposed to AnyVal, Scala also has the concept of AnyRef, which represents a reference to an object. Essentially, any instantiated value that isn't an AnyVal must be an AnyRef (either directly, or indirectly through the type hierarchy). One notable difference with AnyRef is how equality and identity are treated. Since an AnyRef stores a reference to an object rather than the value itself, there is more than one sensible notion of equality. Languages such as C and Java use reference equality by default when comparing non-primitive types, which means separate methods typically have to be defined to compare values by their contents (also known as deep or structural equality).

Similarly, a method often needs to be defined to give an efficient value representation of the reference (known as hashCode in Java).

In Scala, as in other functional languages such as ML and Haskell, structural equality is used by default when comparing structures such as case classes, rather than comparing whether the two objects share the same reference. The same also applies to hashCode.

case class Example(s: String)

Example("test") == Example("test") // returns true: case classes compare by contents

val a = Example("test")
val b = Example("test")

a == b // also returns true
a.hashCode == b.hashCode // also true: hashCode is derived from the contents

class Example2(s: String)

val c = new Example2("test")
val d = new Example2("test")

c == d // returns false: plain classes fall back to reference equality

Finally, you have the Any type, which denotes that something can be either a reference or a value. Any is the supertype of every other type in Scala (that is, everything can be typed as Any). This is in stark contrast to Java which, although it has an Object type, lacks a type that also encompasses primitive values.
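For example, because Any sits at the top of the hierarchy, a single collection can mix values and references, and pattern matching can recover the more specific types at runtime:

```scala
// Int is an AnyVal and String is an AnyRef; their least upper bound is Any
val mixed: List[Any] = List(1, "two", 3.0, true)

val described = mixed.map {
  case i: Int    => s"Int: $i"
  case s: String => s"String: $s"
  case other     => s"Something else: $other"
}
// described: List("Int: 1", "String: two", "Something else: 3.0", "Something else: true")
```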

SUMMARY

As you can see from this introductory chapter, Scala is a language with quite a few orthogonal features which, when combined, produce a highly extensible language able to support expressive and rich libraries while helping ensure correctness.

The advanced type system, combined with implicit parameters and subtyping, allows you to apply type safety to very complex business logic, giving the Scala compiler the ability to detect errors before they reach production. The type system also provides a powerful form of documentation, giving you an initial overview of a library, as well as enabling the accurate type-aware completion found in IDEs. Advanced usage of types is covered in greater detail in Chapter 9.

Scala also provides the necessary tools to improve performance without sacrificing too much expressiveness and abstraction; AnyVal vs. AnyRef is one example of such a feature. Case classes, case objects, and sealed traits set up the basis required to model GADTs, an elegant way to model data structures and ASTs. Pattern matching, essentially a far more powerful version of the switch statement found in many modern languages, gives you a unified solution to many problems, from deconstructing data structures to safe runtime type casts.

A comprehensive collection library provides a vast array of both mutable and immutable data structures behind a common interface, maximizing reusability and keeping method names consistent across collections. It also provides conversions between different data structures. A functional design underpins the collection methods, paving the way for the functional programming explored in Chapter 2 and the much more advanced material in Chapter 10.

Finally, the explicit control that Scala gives you over both runtime and static modularization of code provides a principled way to approach many concerns of modern, large-scale systems, including, but not limited to, dependency injection and loosely coupled modules. The SBT build tool (explained in greater detail in Chapters 4 and 12) allows you to structure, segment, and control how your code is loaded and injected; it also excels at creating artifacts and deploying binaries.

The combination of modularity and functional concepts forms the base design of Scala as a language: "unifying functional and object-oriented programming," in the words of Martin Odersky, the creator of the Scala programming language.

While this chapter has given a general overview of many of these features, Scala itself has an almost boundless ability to express code in the most desirable fashion. Given the huge breadth of the Scala language, the later chapters of this book go into greater depth on a smaller range of essential topics, to enhance your programming experience and give you a more solid foundation in Scala.
