Machine learning with Spark

Please explain the following commands from the machine learning with Spark MovieLens example:

val als = new ALS()
  .setMaxIter(5)
  .setRegParam(0.01)
  .setUserCol("userId")
  .setItemCol("movieId")
  .setRatingCol("rating")

and

predictions.map(r => r(2).asInstanceOf[Float] - r(4).asInstanceOf[Float])
  .map(x => x*x)
  .filter(!_.isNaN)
  .reduce(_ + _)

Let me break it down into parts:

Creating an instance of ALS, the Alternating Least Squares algorithm:
new ALS()

Setting the algorithm's hyperparameters: run at most 5 iterations, with a regularization parameter of 0.01:

.setMaxIter(5)
.setRegParam(0.01)

Telling the algorithm which columns in the data hold the user, the item, and the rating:
.setUserCol("userId").setItemCol("movieId").setRatingCol("rating")
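
Putting these pieces together, a minimal end-to-end sketch might look like the following; the file path, the CSV options, and the 80/20 train/test split are illustrative assumptions, not part of the original example:

import org.apache.spark.ml.recommendation.ALS
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("MovieLensALS").getOrCreate()
import spark.implicits._

// Assumed ratings DataFrame with userId, movieId and rating columns,
// e.g. read from the MovieLens ratings file.
val ratings = spark.read
  .option("header", "true")
  .option("inferSchema", "true")
  .csv("data/ratings.csv")

// Hold out 20% of the ratings for evaluation.
val Array(training, test) = ratings.randomSplit(Array(0.8, 0.2))

val als = new ALS()
  .setMaxIter(5)
  .setRegParam(0.01)
  .setUserCol("userId")
  .setItemCol("movieId")
  .setRatingCol("rating")

// Fit the model and score the held-out set; transform() appends a
// "prediction" column to the test DataFrame.
val model = als.fit(training)
val predictions = model.transform(test)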

Calculating the difference between the actual and the predicted value for each row, accessing the rating and prediction columns by position:
predictions.map(r => r(2).asInstanceOf[Float] - r(4).asInstanceOf[Float])
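
Accessing columns by position (r(2) and r(4)) is fragile; as a hypothetical alternative, assuming the rating column really holds Float values as the cast above implies, the same difference can be computed by column name ("prediction" is the default name of the ALS output column):

predictions.map(r =>
  r.getAs[Float]("rating") - r.getAs[Float]("prediction"))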

Squaring each of the differences computed in the previous step:

.map(x => x*x)

Removing NaN (not-a-number) values, which ALS produces for users or movies that did not appear in the training data:
.filter(!_.isNaN)

Computing the sum of all values:
.reduce(_ + _)
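
As a sketch of how this sum becomes a usable metric (the names squaredErrors and rmse are illustrative), dividing by the number of remaining values and taking the square root yields the root-mean-square error:

// Squared differences between actual and predicted ratings,
// with NaN values (cold-start users/movies) removed.
val squaredErrors = predictions
  .map(r => r(2).asInstanceOf[Float] - r(4).asInstanceOf[Float])
  .map(x => x*x)
  .filter(!_.isNaN)

// Mean of the squared errors, then the square root: the RMSE.
val rmse = math.sqrt(squaredErrors.reduce(_ + _) / squaredErrors.count())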

In summary, it is calculating the sum of the squared differences between the actual and predicted values.
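
For comparison, Spark's built-in RegressionEvaluator computes the RMSE directly from the same predictions DataFrame; from Spark 2.2 on, calling als.setColdStartStrategy("drop") before fitting makes transform() drop the NaN rows, so no manual filter is needed:

import org.apache.spark.ml.evaluation.RegressionEvaluator

val evaluator = new RegressionEvaluator()
  .setMetricName("rmse")          // root-mean-square error
  .setLabelCol("rating")          // actual ratings
  .setPredictionCol("prediction") // ALS output column

val rmse = evaluator.evaluate(predictions)
println(s"Root-mean-square error = $rmse")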