
Functional Programming in Kotlin teaches you how to design and write Kotlin applications using typed functional programming. Offering clear examples, carefully presented explanations, and extensive exercises, it moves from basic subjects like types and data structures to advanced topics such as stream processing. This book is based on the bestselling Functional Programming in Scala by Rúnar Bjarnason and Paul Chiusano.
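
To give a flavor of what "removing the side effects" (section 1.1.2) means in practice, here is a minimal sketch in the spirit of chapter 1. It is not code from the book; the names buyCoffeeWithSideEffect, buyCoffee, and Charge are purely illustrative.

```kotlin
// Illustrative only, not the book's code: a side-effecting function
// versus a pure alternative that returns the effect as data.

// Side-effecting version: charging the card happens *during* the call.
fun buyCoffeeWithSideEffect(creditCard: String): String {
    println("Charging card $creditCard")  // side effect: I/O mixed into the logic
    return "Coffee"
}

// Pure version: the charge is described as a value and returned to the caller,
// who decides when (and whether) to actually perform it.
data class Charge(val creditCard: String, val amount: Double)

fun buyCoffee(creditCard: String): Pair<String, Charge> =
    "Coffee" to Charge(creditCard, amount = 2.50)

fun main() {
    val (coffee, charge) = buyCoffee("1234-5678")
    println("Bought $coffee; pending charge of ${charge.amount} on ${charge.creditCard}")
}
```

Because buyCoffee only builds and returns values, it can be tested, combined, and reasoned about without a payment system in the loop, which is the style the book develops chapter by chapter.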

Table of Contents

  1. Functional Programming in Kotlin
  2. Copyright
  3. contents
  4. front matter
    1. foreword
    2. preface
    3. acknowledgments
    4. about this book
    5. Who should read this book
    6. How this book is organized
    7. How to read this book
    8. About the code
    9. liveBook discussion forum
  5. Part 1. Introduction to functional programming
  6. 1 What is functional programming?
    1. 1.1 The benefits of FP: A simple example
    2. 1.1.1 A program with side effects
    3. 1.1.2 A functional solution: Removing the side effects
    4. 1.2 Exactly what is a (pure) function?
    5. 1.3 RT, purity, and the substitution model
    6. 1.4 What lies ahead
    7. Summary
  7. 2 Getting started with functional programming in Kotlin
    1. 2.1 Higher-order functions: Passing functions to functions
    2. 2.1.1 A short detour: Writing loops functionally
    3. 2.1.2 Writing our first higher-order function
    4. 2.2 Polymorphic functions: Abstracting over types
    5. 2.2.1 An example of a polymorphic function
    6. 2.2.2 Calling HOFs with anonymous functions
    7. 2.3 Following types to implementations
    8. Summary
  8. 3 Functional data structures
    1. 3.1 Defining functional data structures
    2. 3.2 Working with functional data structures
    3. 3.2.1 The “when” construct for matching by type
    4. 3.2.2 The when construct as an alternative to if-else logic
    5. 3.2.3 Pattern matching and how it differs from Kotlin matching
    6. 3.3 Data sharing in functional data structures
    7. 3.3.1 The efficiency of data sharing
    8. 3.4 Recursion over lists and generalizing to HOFs
    9. 3.4.1 More functions for working with lists
    10. 3.4.2 Lists in the Kotlin standard library
    11. 3.4.3 Inefficiency of assembling list functions from simpler components
    12. 3.5 Trees
    13. Summary
  9. 4 Handling errors without exceptions
    1. 4.1 The problems with throwing exceptions
    2. 4.2 Problematic alternatives to exceptions
    3. 4.2.1 Sentinel value
    4. 4.2.2 Supplied default value
    5. 4.3 Encoding success conditions with Option
    6. 4.3.1 Usage patterns for Option
    7. 4.3.2 Option composition, lifting, and wrapping exception-oriented APIs
    8. 4.3.3 For-comprehensions with Option
    9. 4.4 Encoding success and failure conditions with Either
    10. 4.4.1 For-comprehensions with Either
    11. Summary
  10. 5 Strictness and laziness
    1. 5.1 Strict and non-strict functions
    2. 5.2 An extended example: Lazy lists
    3. 5.2.1 Memoizing streams and avoiding recomputation
    4. 5.2.2 Helper functions for inspecting streams
    5. 5.3 Separating program description from evaluation
    6. 5.4 Producing infinite data streams through corecursive functions
    7. 5.5 Conclusion
    8. Summary
  11. 6 Purely functional state
    1. 6.1 Generating random numbers using side effects
    2. 6.2 Purely functional random number generation
    3. 6.3 Making stateful APIs pure
    4. 6.4 An implicit approach to passing state actions
    5. 6.4.1 More power by combining state actions
    6. 6.4.2 Recursive retries through nested state actions
    7. 6.4.3 Applying the combinator API to the initial example
    8. 6.5 A general state action data type
    9. 6.6 Purely functional imperative programming
    10. 6.7 Conclusion
    11. Summary
  12. Part 2. Functional design and combinator libraries
  13. 7 Purely functional parallelism
    1. 7.1 Choosing data types and functions
    2. 7.1.1 A data type for parallel computations
    3. 7.1.2 Combining parallel computations to ensure concurrency
    4. 7.1.3 Marking computations to be forked explicitly
    5. 7.2 Picking a representation
    6. 7.3 Refining the API with the end user in mind
    7. 7.4 Reasoning about the API in terms of algebraic equations
    8. 7.4.1 The law of mapping
    9. 7.4.2 The law of forking
    10. 7.4.3 Using actors for a non-blocking implementation
    11. 7.5 Refining combinators to their most general form
    12. Summary
  14. 8 Property-based testing
    1. 8.1 A brief tour of property-based testing
    2. 8.2 Choosing data types and functions
    3. 8.2.1 Gathering initial snippets for a possible API
    4. 8.2.2 Exploring the meaning and API of properties
    5. 8.2.3 Discovering the meaning and API of generators
    6. 8.2.4 Generators that depend on generated values
    7. 8.2.5 Refining the property data type
    8. 8.3 Test case minimization
    9. 8.4 Using the library and improving the user experience
    10. 8.4.1 Some simple examples
    11. 8.4.2 Writing a test suite for parallel computations
    12. 8.5 Generating higher-order functions and other possibilities
    13. 8.6 The laws of generators
    14. 8.7 Conclusion
    15. Summary
  15. 9 Parser combinators
    1. 9.1 Designing an algebra
    2. 9.1.1 A parser to recognize single characters
    3. 9.1.2 A parser to recognize entire strings
    4. 9.1.3 A parser to recognize repetition
    5. 9.2 One possible approach to designing an algebra
    6. 9.2.1 Counting character repetition
    7. 9.2.2 Slicing and nonempty repetition
    8. 9.3 Handling context sensitivity
    9. 9.4 Writing a JSON parser
    10. 9.4.1 Defining expectations of a JSON parser
    11. 9.4.2 Reviewing the JSON format
    12. 9.4.3 A JSON parser
    13. 9.5 Surfacing errors through reporting
    14. 9.5.1 First attempt at representing errors
    15. 9.5.2 Accumulating errors through error nesting
    16. 9.5.3 Controlling branching and backtracking
    17. 9.6 Implementing the algebra
    18. 9.6.1 Building up the algebra implementation gradually
    19. 9.6.2 Sequencing parsers after each other
    20. 9.6.3 Capturing error messages through labeling parsers
    21. 9.6.4 Recovering from error conditions and backtracking over them
    22. 9.6.5 Propagating state through context-sensitive parsers
    23. 9.7 Conclusion
    24. Summary
  16. Part 3. Common structures in functional design
  17. 10 Monoids
    1. 10.1 What is a monoid?
    2. 10.2 Folding lists with monoids
    3. 10.3 Associativity and parallelism
    4. 10.4 Example: Parallel parsing
    5. 10.5 Foldable data structures
    6. 10.6 Composing monoids
    7. 10.6.1 Assembling more complex monoids
    8. 10.6.2 Using composed monoids to fuse traversals
    9. Summary
  18. 11 Monads and functors
    1. 11.1 Functors
    2. 11.1.1 Defining the functor by generalizing the map function
    3. 11.1.2 The importance of laws and their relation to the functor
    4. 11.2 Monads: Generalizing the flatMap and unit functions
    5. 11.2.1 Introducing the Monad interface
    6. 11.3 Monadic combinators
    7. 11.4 Monad laws
    8. 11.4.1 The associative law
    9. 11.4.2 Proving the associative law for a specific monad
    10. 11.4.3 The left and right identity laws
    11. 11.5 Just what is a monad?
    12. 11.5.1 The identity monad
    13. 11.5.2 The State monad and partial type application
    14. Summary
  19. 12 Applicative and traversable functors
    1. 12.1 Generalizing monads for reusability
    2. 12.2 Applicatives as an alternative abstraction to the monad
    3. 12.3 The difference between monads and applicative functors
    4. 12.3.1 The Option applicative vs. the Option monad
    5. 12.3.2 The Parser applicative vs. the Parser monad
    6. 12.4 The advantages of applicative functors
    7. 12.4.1 Not all applicative functors are monads
    8. 12.5 Reasoning about programs through the applicative laws
    9. 12.5.1 Laws of left and right identity
    10. 12.5.2 Law of associativity
    11. 12.5.3 Law of naturality
    12. 12.6 Abstracting traverse and sequence using traversable functors
    13. 12.7 Using Traversable to iteratively transform higher kinds
    14. 12.7.1 From monoids to applicative functors
    15. 12.7.2 Traversing collections while propagating state actions
    16. 12.7.3 Combining traversable structures
    17. 12.7.4 Traversal fusion for single pass efficiency
    18. 12.7.5 Simultaneous traversal of nested traversable structures
    19. 12.7.6 Pitfalls and workarounds for monad composition
    20. Summary
  20. Part 4. Effects and I/O
  21. 13 External effects and I/O
    1. 13.1 Factoring effects out of an effectful program
    2. 13.2 Introducing the IO type to separate effectful code
    3. 13.2.1 Handling input effects
    4. 13.2.2 Benefits and drawbacks of the simple IO type
    5. 13.3 Avoiding stack overflow errors by reification and trampolining
    6. 13.3.1 Reifying control flow as data constructors
    7. 13.3.2 Trampolining: A general solution to stack overflow
    8. 13.4 A more nuanced IO type
    9. 13.4.1 Reasonably priced monads
    10. 13.4.2 A monad that supports only console I/O
    11. 13.4.3 Testing console I/O by using pure interpreters
    12. 13.5 Non-blocking and asynchronous I/O
    13. 13.6 A general-purpose IO type
    14. 13.6.1 The main program at the end of the universe
    15. 13.7 Why the IO type is insufficient for streaming I/O
    16. Summary
  22. 14 Local effects and mutable state
    1. 14.1 State mutation is legal in pure functional code
    2. 14.2 A data type to enforce scoping of side effects
    3. 14.2.1 A domain-specific language for scoped mutation
    4. 14.2.2 An algebra of mutable references
    5. 14.2.3 Running mutable state actions
    6. 14.2.4 The mutable array represented as a data type for the ST monad
    7. 14.2.5 A purely functional in-place quicksort
    8. 14.3 Purity is contextual
    9. 14.3.1 Definition by example
    10. 14.3.2 What counts as a side effect?
    11. Summary
  23. 15 Stream processing and incremental I/O
    1. 15.1 Problems with imperative I/O: An example
    2. 15.2 Transforming streams with simple transducers
    3. 15.2.1 Combinators for building stream transducers
    4. 15.2.2 Combining multiple transducers by appending and composing
    5. 15.2.3 Stream transducers for file processing
    6. 15.3 An extensible process type for protocol parameterization
    7. 15.3.1 Sources for stream emission
    8. 15.3.2 Ensuring resource safety in stream transducers
    9. 15.3.3 Applying transducers to a single-input stream
    10. 15.3.4 Multiple input streams
    11. 15.3.5 Sinks for output processing
    12. 15.3.6 Hiding effects in effectful channels
    13. 15.3.7 Dynamic resource allocation
    14. 15.4 Application of stream transducers in the real world
    15. Summary
    16. Final words
  24. Appendix A. Exercise hints and tips
    1. A.1 Chapter 3: Functional data structures
    2. Exercise 3.1
    3. Exercise 3.2
    4. Exercise 3.3
    5. Exercise 3.4
    6. Exercise 3.5
    7. Exercise 3.6
    8. Exercise 3.7
    9. Exercise 3.12
    10. Exercise 3.14
    11. Exercise 3.15
    12. Exercise 3.16
    13. Exercise 3.17
    14. Exercise 3.18
    15. Exercise 3.19
    16. Exercise 3.23
    17. Exercise 3.28
    18. A.2 Chapter 4: Handling errors without exceptions
    19. Exercise 4.3
    20. Exercise 4.4
    21. Exercise 4.5
    22. Exercise 4.6
    23. Exercise 4.7
    24. Exercise 4.8
    25. A.3 Chapter 5: Strictness and laziness
    26. Exercise 5.1
    27. Exercise 5.2
    28. Exercise 5.4
    29. Exercise 5.6
    30. Exercise 5.9
    31. Exercise 5.10
    32. Exercise 5.11
    33. Exercise 5.14
    34. Exercise 5.15
    35. Exercise 5.16
    36. A.4 Chapter 6: Purely functional state
    37. Exercise 6.2
    38. Exercise 6.5
    39. Exercise 6.6
    40. Exercise 6.7
    41. Exercise 6.8
    42. Exercise 6.9
    43. Exercise 6.10
    44. A.5 Chapter 7: Purely functional parallelism
    45. Exercise 7.1
    46. Exercise 7.2
    47. Exercise 7.3
    48. Exercise 7.5
    49. Exercise 7.7
    50. A.6 Chapter 8: Property-based testing
    51. Exercise 8.4
    52. Exercise 8.5
    53. Exercise 8.6
    54. Exercise 8.9
    55. Exercise 8.12
    56. Exercise 8.13
    57. Exercise 8.16
    58. A.7 Chapter 9: Parser combinators
    59. Exercise 9.1
    60. Exercise 9.2
    61. Exercise 9.7
    62. Exercise 9.9
    63. Exercise 9.10
    64. Exercise 9.12
    65. A.8 Chapter 10: Monoids
    66. Exercise 10.2
    67. Exercise 10.3
    68. Exercise 10.4
    69. Exercise 10.5
    70. Exercise 10.6
    71. Exercise 10.7
    72. Exercise 10.8
    73. Exercise 10.9
    74. Exercise 10.13
    75. Exercise 10.19
    76. A.9 Chapter 11: Monads and functors
    77. Exercise 11.1
    78. Exercise 11.2
    79. Exercise 11.3
    80. Exercise 11.4
    81. Exercise 11.6
    82. Exercise 11.7
    83. Exercise 11.8
    84. Exercise 11.9
    85. Exercise 11.10
    86. Exercise 11.13
    87. Exercise 11.16
    88. Exercise 11.18
    89. Exercise 11.19
    90. A.10 Chapter 12: Applicative and traversable functors
    91. Exercise 12.2
    92. Exercise 12.3
    93. Exercise 12.5
    94. Exercise 12.6
    95. Exercise 12.7
    96. Exercise 12.8
    97. Exercise 12.9
    98. Exercise 12.10
    99. Exercise 12.11
    100. Exercise 12.12
    101. Exercise 12.13
    102. Exercise 12.15
    103. Exercise 12.16
    104. Exercise 12.17
    105. Exercise 12.18
    106. Exercise 12.19
    107. A.11 Chapter 13: External effects and I/O
    108. Exercise 13.1
    109. Exercise 13.2
    110. Exercise 13.4
    111. A.12 Chapter 14: Local effects and mutable state
    112. Exercise 14.1
    113. Exercise 14.2
    114. Exercise 14.3
    115. A.13 Chapter 15: Stream processing and incremental I/O
    116. Exercise 15.3
    117. Exercise 15.5
    118. Exercise 15.6
    119. Exercise 15.7
    120. Exercise 15.9
    121. Exercise 15.10
    122. Exercise 15.11
    123. Exercise 15.12
  25. Appendix B. Exercise solutions
    1. B.1 Before you proceed to the solutions
    2. B.2 Getting started with functional programming
    3. Exercise 2.1
    4. Exercise 2.2
    5. Exercise 2.3
    6. Exercise 2.4
    7. Exercise 2.5
    8. B.3 Functional data structures
    9. Exercise 3.1
    10. Exercise 3.2
    11. Exercise 3.3
    12. Exercise 3.4
    13. Exercise 3.5
    14. Exercise 3.6
    15. Exercise 3.7
    16. Exercise 3.8
    17. Exercise 3.9
    18. Exercise 3.10
    19. Exercise 3.11
    20. Exercise 3.12 (Hard)
    21. Exercise 3.13
    22. Exercise 3.14 (Hard)
    23. Exercise 3.15
    24. Exercise 3.16
    25. Exercise 3.17
    26. Exercise 3.18
    27. Exercise 3.19
    28. Exercise 3.20
    29. Exercise 3.21
    30. Exercise 3.22
    31. Exercise 3.23
    32. Exercise 3.24
    33. Exercise 3.25
    34. Exercise 3.26
    35. Exercise 3.27
    36. Exercise 3.28
    37. B.4 Handling errors without exceptions
    38. Exercise 4.1
    39. Exercise 4.2
    40. Exercise 4.3
    41. Exercise 4.4
    42. Exercise 4.5
    43. Exercise 4.6
    44. Exercise 4.7
    45. Exercise 4.8
    46. B.5 Strictness and laziness
    47. Exercise 5.1
    48. Exercise 5.2
    49. Exercise 5.3
    50. Exercise 5.4
    51. Exercise 5.5
    52. Exercise 5.6 (Hard)
    53. Exercise 5.7
    54. Exercise 5.8
    55. Exercise 5.9
    56. Exercise 5.10
    57. Exercise 5.11
    58. Exercise 5.12
    59. Exercise 5.13
    60. Exercise 5.14 (Hard)
    61. Exercise 5.15
    62. Exercise 5.16 (Hard)
    63. B.6 Purely functional state
    64. Exercise 6.1
    65. Exercise 6.2
    66. Exercise 6.3
    67. Exercise 6.4
    68. Exercise 6.5
    69. Exercise 6.6
    70. Exercise 6.7
    71. Exercise 6.8
    72. Exercise 6.9
    73. Exercise 6.10
    74. Exercise 6.11
    75. B.7 Purely functional parallelism
    76. Exercise 7.1
    77. Exercise 7.2
    78. Exercise 7.3
    79. Exercise 7.4
    80. Exercise 7.5
    81. Exercise 7.6
    82. Exercise 7.7 (Hard)
    83. Exercise 7.8 (Hard)
    84. Exercise 7.9 (Hard/Optional)
    85. Exercise 7.10
    86. Exercise 7.11
    87. Exercise 7.12
    88. Exercise 7.13
    89. B.8 Property-based testing
    90. Exercise 8.1
    91. Exercise 8.2
    92. Exercise 8.3
    93. Exercise 8.4
    94. Exercise 8.5
    95. Exercise 8.6
    96. Exercise 8.7
    97. Exercise 8.8
    98. Exercise 8.9
    99. Exercise 8.10
    100. Exercise 8.11
    101. Exercise 8.12
    102. Exercise 8.13
    103. Exercise 8.14
    104. Exercise 8.15
    105. Exercise 8.16
    106. Exercise 8.17
    107. B.9 Parser combinators
    108. Exercise 9.1
    109. Exercise 9.2 (Hard)
    110. Exercise 9.3
    111. Exercise 9.4
    112. Exercise 9.5
    113. Exercise 9.6
    114. Exercise 9.7
    115. Exercise 9.8
    116. Exercise 9.9 (Hard)
    117. Exercise 9.10
    118. Exercise 9.11
    119. Exercise 9.12
    120. Exercise 9.13
    121. Exercise 9.14
    122. B.10 Monoids
    123. Exercise 10.1
    124. Exercise 10.2
    125. Exercise 10.3
    126. Exercise 10.4
    127. Exercise 10.5
    128. Exercise 10.6
    129. Exercise 10.7
    130. Exercise 10.8 (Hard/Optional)
    131. Exercise 10.9 (Hard/Optional)
    132. Exercise 10.10
    133. Exercise 10.11
    134. Exercise 10.12
    135. Exercise 10.13
    136. Exercise 10.14
    137. Exercise 10.15
    138. Exercise 10.16
    139. Exercise 10.17
    140. Exercise 10.18
    141. Exercise 10.19
    142. B.11 Monads and functors
    143. Exercise 11.1
    144. Exercise 11.2
    145. Exercise 11.3
    146. Exercise 11.4
    147. Exercise 11.5
    148. Exercise 11.6 (Hard)
    149. Exercise 11.7
    150. Exercise 11.8 (Hard)
    151. Exercise 11.9
    152. Exercise 11.10
    153. Exercise 11.11
    154. Exercise 11.12
    155. Exercise 11.13 (Hard/Optional)
    156. Exercise 11.14 (Hard/Optional)
    157. Exercise 11.15 (Hard/Optional)
    158. Exercise 11.16
    159. Exercise 11.17
    160. Exercise 11.18
    161. Exercise 11.19 (Hard)
    162. B.12 Applicative and traversable functors
    163. Exercise 12.1
    164. Exercise 12.2 (Hard)
    165. Exercise 12.3
    166. Exercise 12.4
    167. Exercise 12.5
    168. Exercise 12.6
    169. Exercise 12.7 (Hard)
    170. Exercise 12.8
    171. Exercise 12.9
    172. Exercise 12.10
    173. Exercise 12.11
    174. Exercise 12.12 (Hard)
    175. Exercise 12.13 (Hard)
    176. Exercise 12.14
    177. Exercise 12.15
    178. Exercise 12.16
    179. Exercise 12.17
    180. Exercise 12.18 (Hard)
    181. Exercise 12.19 (Hard/Optional)
    182. B.13 External effects and I/O
    183. Exercise 13.1
    184. Exercise 13.2
    185. Exercise 13.3 (Hard)
    186. Exercise 13.4 (Hard/Optional)
    187. B.14 Local effects and mutable state
    188. Exercise 14.1
    189. Exercise 14.2
    190. Exercise 14.3
    191. B.15 Stream processing and incremental I/O
    192. Exercise 15.1
    193. Exercise 15.2
    194. Exercise 15.3
    195. Exercise 15.4
    196. Exercise 15.5 (Hard)
    197. Exercise 15.6
    198. Exercise 15.7 (Optional)
    199. Exercise 15.8
    200. Exercise 15.9 (Optional)
    201. Exercise 15.10
    202. Exercise 15.11
    203. Exercise 15.12
  26. Appendix C. Higher-kinded types
    1. C.1 A compiler workaround
    2. C.2 Partially applied type constructors
    3. C.3 Boilerplate code generation with Arrow Meta
  27. Appendix D. Type classes
    1. D.1 Polymorphism
    2. D.2 Using type classes to express ad hoc polymorphism
    3. D.3 Type classes foster a separation of concerns
  28. index