Spark-shell execution

I am running the following steps in the spark-shell, and I want to know why the counter value is not increased by the for loop iteration:

var accessLogs = sc.textFile("/data/spark/project/access/access.log.45.gz")
accessLogs.count // 24857
var counter = 0 // counter: Int = 0
for (i <- accessLogs) counter = counter + 1 // counter still shows 0 instead of 24857
for (i <- accessLogs) println(i) // prints every line, all 24857 of them

Do not use a for loop with a driver-side mutable variable over an RDD. The loop desugars to accessLogs.foreach(...), and Spark serializes that closure (including its own copy of counter) and ships it to the executors, so each task increments its private copy while the counter on the driver stays 0. Use transformations and actions instead, or an accumulator if you genuinely need a shared counter.
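
For example, a rough sketch of both approaches in the spark-shell. This assumes Spark 2.x's longAccumulator API; the accumulator name "lineCount" and the "404" filter are only illustrative:

// An action: count runs on the executors and returns the total to the driver
val total = accessLogs.count() // 24857

// A transformation plus an action, useful if you need to filter first
val notFound = accessLogs.filter(_.contains("404")).count()

// If you really want a counter updated inside foreach, use an accumulator,
// which Spark aggregates from the executors back to the driver
val lineCount = sc.longAccumulator("lineCount")
accessLogs.foreach(_ => lineCount.add(1))
lineCount.value // 24857

Note that accumulator updates are only guaranteed to be applied exactly once when they happen inside an action such as foreach; inside transformations they can be re-applied if a task is retried.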