When I try to retrieve the DatanodeInfo in my Java program, I am getting the following error:

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Access denied for user anusharj4128. Superuser privilege is required
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkSuperuserPrivilege(FSPermissionChecker.java:122)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkSuperuserPrivilege(FSNamesystem.java:6342)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.datanodeReport(FSNamesystem.java:5312)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDatanodeReport(NameNodeRpcServer.java:1164)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDatanodeReport(ClientNamenodeProtocolServerSideTranslatorPB.java:721)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
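
From the stack trace, the getDatanodeReport RPC on the NameNode rejects my user anusharj4128 because it is not the HDFS superuser. The sketch below shows what I assume would satisfy that check by running the call under the superuser identity (the user name "hdfs", the NameNode URI, and simple non-Kerberos authentication are all assumptions on my part); I would rather avoid this if there is a way to get the datanode information as a regular user.

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;
import org.apache.hadoop.security.UserGroupInformation;

public class DatanodeReportAsSuperuser {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder NameNode address; replace with the cluster's fs.defaultFS
        conf.set("fs.defaultFS", "hdfs://namenode-host:8020");

        // "hdfs" is assumed to be the HDFS superuser (the user the NameNode runs as);
        // createRemoteUser only bypasses the check like this with simple authentication, not Kerberos
        UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hdfs");

        DatanodeInfo[] datanodes = ugi.doAs(
                (PrivilegedExceptionAction<DatanodeInfo[]>) () -> {
                    // A FileSystem obtained inside doAs talks to the NameNode as "hdfs"
                    DistributedFileSystem dfs = (DistributedFileSystem) FileSystem.get(conf);
                    return dfs.getDataNodeStats();
                });

        for (DatanodeInfo dn : datanodes) {
            System.out.println(dn.getDatanodeReport());
        }
    }
}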

Below is the code I used. I need the DatanodeInfo in my program.

import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

// dfs is a DistributedFileSystem handle (see the setup sketch further below)
DatanodeInfo[] datanodeInfoList = dfs.getDataNodeStats();
for (DatanodeInfo datanode : datanodeInfoList) {
    System.out.println("datanode Report : " + datanode.getDatanodeReport());
    System.out.println("datanode Host : " + datanode.getHostName());
    System.out.println("datanode Info Port : " + datanode.getInfoPort());
    System.out.println("datanode Network Location : " + datanode.getNetworkLocation());
    System.out.println("datanode Parent : " + datanode.getParent());
    System.out.println("datanode DFS Used : " + datanode.getDfsUsed());
    System.out.println("datanode Capacity : " + datanode.getCapacity());
    System.out.println("datanode Level : " + datanode.getLevel());
    System.out.println("datanode Remaining : " + datanode.getRemaining());
}
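
For context, the dfs handle above comes from something along these lines (a minimal sketch; the NameNode URI is a placeholder, and the cast assumes fs.defaultFS points to an HDFS filesystem):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;

Configuration conf = new Configuration();
// Placeholder NameNode address; the real cluster uses a different host/port
conf.set("fs.defaultFS", "hdfs://namenode-host:8020");

// getDataNodeStats() is declared on DistributedFileSystem, hence the cast
DistributedFileSystem dfs =
        (DistributedFileSystem) FileSystem.get(URI.create("hdfs://namenode-host:8020"), conf);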