The Spark REST API provides almost everything you are asking for.

A few examples:

What percentage of an RDD is currently in memory?
GET /api/v1/applications/[app-id]/storage/rdd/0
will respond with:
{
  "id" : 0,
  "name" : "ParallelCollectionRDD",
  "numPartitions" : 2,
  "numCachedPartitions" : 2,
  "storageLevel" : "Memory Deserialized 1x Replicated",
  "memoryUsed" : 28000032,
  "diskUsed" : 0,
  "dataDistribution" : [ {
    "address" : "localhost:54984",
    "memoryUsed" : 28000032,
    "memoryRemaining" : 527755733,
    "diskUsed" : 0
  } ],
  "partitions" : [ {
    "blockName" : "rdd_0_0",
    "storageLevel" : "Memory Deserialized 1x Replicated",
    "memoryUsed" : 14000016,
    "diskUsed" : 0,
    "executors" : [ "localhost:54984" ]
  }, {
    "blockName" : "rdd_0_1",
    "storageLevel" : "Memory Deserialized 1x Replicated",
    "memoryUsed" : 14000016,
    "diskUsed" : 0,
    "executors" : [ "localhost:54984" ]
  } ]
}
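To answer the percentage question programmatically, you can parse this response. Here is a minimal sketch using a trimmed copy of the sample payload above; in a real setup you would fetch it over HTTP from the Spark UI, which serves the REST API on port 4040 by default:

```python
import json

# Trimmed copy of the sample response from
# GET /api/v1/applications/[app-id]/storage/rdd/0
# (in practice, fetch it with e.g. urllib.request from http://<driver>:4040).
rdd_status = json.loads("""
{
  "id": 0,
  "name": "ParallelCollectionRDD",
  "numPartitions": 2,
  "numCachedPartitions": 2,
  "memoryUsed": 28000032,
  "diskUsed": 0
}
""")

def cached_fraction(rdd):
    """Fraction of the RDD's partitions currently cached."""
    return rdd["numCachedPartitions"] / rdd["numPartitions"]

print(f"{cached_fraction(rdd_status):.0%} of partitions cached")
# -> 100% of partitions cached
print(f"{rdd_status['memoryUsed']} bytes in memory")
```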
How much total time did it take to compute an RDD?

Computing an RDD happens inside a job, which is broken into stages, and each stage may have several attempts. Per-task timing for a stage attempt is available:

GET /api/v1/applications/[app-id]/stages/[stage-id]/[stage-attempt-id]/taskSummary

will respond with:
{
  "quantiles" : [ 0.05, 0.25, 0.5, 0.75, 0.95 ],
  "executorDeserializeTime" : [ 2.0, 2.0, 2.0, 2.0, 2.0 ],
  "executorRunTime" : [ 3.0, 3.0, 4.0, 4.0, 4.0 ],
  "resultSize" : [ 1457.0, 1457.0, 1457.0, 1457.0, 1457.0 ],
  "jvmGcTime" : [ 0.0, 0.0, 0.0, 0.0, 0.0 ],
  "resultSerializationTime" : [ 0.0, 0.0, 0.0, 0.0, 0.0 ],
  "memoryBytesSpilled" : [ 0.0, 0.0, 0.0, 0.0, 0.0 ],
  "diskBytesSpilled" : [ 0.0, 0.0, 0.0, 0.0, 0.0 ],
  "shuffleReadMetrics" : {
    "readBytes" : [ 340.0, 340.0, 342.0, 342.0, 342.0 ],
    "readRecords" : [ 10.0, 10.0, 10.0, 10.0, 10.0 ],
    "remoteBlocksFetched" : [ 0.0, 0.0, 0.0, 0.0, 0.0 ],
    "localBlocksFetched" : [ 2.0, 2.0, 2.0, 2.0, 2.0 ],
    "fetchWaitTime" : [ 0.0, 0.0, 0.0, 0.0, 0.0 ],
    "remoteBytesRead" : [ 0.0, 0.0, 0.0, 0.0, 0.0 ],
    "totalBlocksFetched" : [ 2.0, 2.0, 2.0, 2.0, 2.0 ]
  }
}
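Each metric is reported per quantile, so to estimate a typical per-task time you can pick the median (0.5 quantile) of each time component and add them up. A minimal sketch against a trimmed copy of the sample payload above (times are in milliseconds):

```python
import json

# Trimmed copy of the sample taskSummary response above;
# each list is aligned with the "quantiles" list.
summary = json.loads("""
{
  "quantiles": [0.05, 0.25, 0.5, 0.75, 0.95],
  "executorDeserializeTime": [2.0, 2.0, 2.0, 2.0, 2.0],
  "executorRunTime": [3.0, 3.0, 4.0, 4.0, 4.0],
  "jvmGcTime": [0.0, 0.0, 0.0, 0.0, 0.0],
  "resultSerializationTime": [0.0, 0.0, 0.0, 0.0, 0.0]
}
""")

def median(metric):
    """Value of a metric at the 0.5 quantile."""
    i = summary["quantiles"].index(0.5)
    return summary[metric][i]

# Rough median wall-clock cost per task for this stage attempt, in ms:
total_ms = (median("executorDeserializeTime")
            + median("executorRunTime")
            + median("resultSerializationTime"))
print(total_ms)  # -> 6.0
```

This gives a per-task figure for one stage attempt; to approximate the total time for the whole RDD computation you would repeat this over every stage of the job.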
Your question is too broad for me to answer point by point, but I believe everything Spark tracks is surfaced through the REST API.