I have scanned and trained multiple real-world objects, and I do have the ARReferenceObject files; the app detects them well.
The problem I am facing is that when an object doesn't have distinct, vibrant features, it takes a few seconds to return a detection result I can act on. What I would like is for the app to display a bounding box and an activity indicator on top of the object while it is still trying to detect it.
I haven't found any information on this. Also, is there any way to get the time at which detection starts, or a confidence percentage for the detected object?
Any help is appreciated.
It is possible to display a boundingBox for an ARReferenceObject prior to it being detected; although I am not sure why you would want to do that (in advance anyway).
For example, assuming your referenceObject was on a horizontal surface, you would first need to place the estimated bounding box on the plane (or use some other method to place it in advance), and in the time it took to detect the ARPlaneAnchor and place the boundingBox, your model would most likely already have been detected.
A possible approach:
As you are no doubt aware, an ARReferenceObject has center, extent and scale properties, as well as a set of rawFeaturePoints associated with the object.
As such, based on some of Apple's sample code from Scanning & Detecting 3D Objects, we can create our own SCNNode subclass which will display a bounding box of the approximate size of the locally stored ARReferenceObject, before it has been detected.
Note that you will need to locate the 'wireframe_shader' from the Apple sample code in order for the boundingBox to render transparently:
import Foundation
import ARKit
import SceneKit

class BlackMirrorzBoundingBox: SCNNode {

    //-----------------------
    // MARK: - Initialization
    //-----------------------

    /// Creates A WireFrame Bounding Box From The Data Retrieved From The ARReferenceObject
    ///
    /// - Parameters:
    ///   - points: [float3]
    ///   - scale: CGFloat
    ///   - color: UIColor
    init(points: [float3], scale: CGFloat, color: UIColor = .cyan) {
        super.init()

        var localMin = float3(Float.greatestFiniteMagnitude)
        var localMax = float3(-Float.greatestFiniteMagnitude)

        for point in points {
            localMin = min(localMin, point)
            localMax = max(localMax, point)
        }

        self.simdPosition += (localMax + localMin) / 2
        let extent = localMax - localMin

        let wireFrame = SCNNode()
        let box = SCNBox(width: CGFloat(extent.x), height: CGFloat(extent.y), length: CGFloat(extent.z), chamferRadius: 0)
        box.firstMaterial?.diffuse.contents = color
        box.firstMaterial?.isDoubleSided = true
        wireFrame.geometry = box
        setupShaderOnGeometry(box)
        self.addChildNode(wireFrame)
    }

    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) Has Not Been Implemented") }

    //----------------
    // MARK: - Shaders
    //----------------

    /// Sets A Shader To Render The Cube As A Wireframe
    ///
    /// - Parameter geometry: SCNBox
    func setupShaderOnGeometry(_ geometry: SCNBox) {

        guard let path = Bundle.main.path(forResource: "wireframe_shader", ofType: "metal", inDirectory: "art.scnassets"),
              let shader = try? String(contentsOfFile: path, encoding: .utf8) else {
            return
        }

        geometry.firstMaterial?.shaderModifiers = [.surface: shader]
    }
}
To display the bounding box you would then do something like the following, noting that in my example I have these variables:
@IBOutlet var augmentedRealityView: ARSCNView!
let configuration = ARWorldTrackingConfiguration()
let augmentedRealitySession = ARSession()
To display the boundingBox before the actual object itself has been detected, you would call the func loadBoundingBox, e.g. in viewDidLoad:
/// Creates A Bounding Box From The Data Available From The ARObject In The Local Bundle
func loadBoundingBox(){

    //1. Run Our Session
    augmentedRealityView.session = augmentedRealitySession
    augmentedRealityView.delegate = self

    //2. Load A Single ARReferenceObject From The Main Bundle
    if let objectURL = Bundle.main.url(forResource: "fox", withExtension: ".arobject"){

        do{
            var referenceObjects = [ARReferenceObject]()
            let object = try ARReferenceObject(archiveURL: objectURL)

            //3. Log Its Properties
            print("""
                Object Center = \(object.center)
                Object Extent = \(object.extent)
                Object Scale = \(object.scale)
                """)

            //4. Get Its Scale
            let scale = CGFloat(object.scale.x)

            //5. Create A Bounding Box
            let boundingBoxNode = BlackMirrorzBoundingBox(points: object.rawFeaturePoints.points, scale: scale)

            //6. Add It To The ARSCNView
            self.augmentedRealityView.scene.rootNode.addChildNode(boundingBoxNode)

            //7. Position It 0.5m Down & 0.5m Away From The Camera
            boundingBoxNode.position = SCNVector3(0, -0.5, -0.5)

            //8. Add It To The Configuration
            referenceObjects.append(object)
            configuration.detectionObjects = Set(referenceObjects)

        }catch{
            print(error)
        }
    }

    //9. Run The Session
    augmentedRealitySession.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    augmentedRealityView.automaticallyUpdatesLighting = true
}
The example above simply creates a boundingBox from the as-yet-undetected ARReferenceObject and places it 0.5m down from, and 0.5m away from, the Camera, which yields something like this:
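Your question also asked for an activity indicator while detection is in progress; the Apple sample code doesn't cover that directly, but one minimal sketch (using a standard UIActivityIndicatorView overlaid on the ARSCNView; `detectionSpinner` is a hypothetical property on your ViewController) might look like this:

```swift
import UIKit

// Hypothetical property on your ViewController; .whiteLarge was the pre-iOS 13 style name
let detectionSpinner = UIActivityIndicatorView(style: .whiteLarge)

/// Shows A Spinner In The Centre Of The ARSCNView While Detection Is In Progress
func showDetectionSpinner() {
    detectionSpinner.center = augmentedRealityView.center
    detectionSpinner.startAnimating()
    augmentedRealityView.addSubview(detectionSpinner)
}

/// Hides The Spinner; Call This From renderer(_:didAdd:for:) Once The ARObjectAnchor Arrives
func hideDetectionSpinner() {
    // The Delegate Callback Fires On A Background Queue, So Hop To The Main Thread For UIKit
    DispatchQueue.main.async {
        self.detectionSpinner.stopAnimating()
        self.detectionSpinner.removeFromSuperview()
    }
}
```

You would call showDetectionSpinner() alongside loadBoundingBox(), and hideDetectionSpinner() in the delegate callback shown below.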
You would of course need to handle the initial positioning of the boundingBox yourself, as well as handle the removal of the boundingBox 'indicator' once detection has occurred.
The method below simply displays a boundingBox when the actual object has been detected, e.g.:
//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------

extension ViewController: ARSCNViewDelegate{

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARObject Anchor
        guard let objectAnchor = anchor as? ARObjectAnchor else { return }

        //2. Create A Bounding Box Around Our Object
        let scale = CGFloat(objectAnchor.referenceObject.scale.x)
        let boundingBoxNode = BlackMirrorzBoundingBox(points: objectAnchor.referenceObject.rawFeaturePoints.points, scale: scale)
        node.addChildNode(boundingBoxNode)
    }
}
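As for removing the pre-placed 'indicator' box once the real object has been found, one sketch (assuming you keep a reference to the node created in loadBoundingBox in a hypothetical `placeholderBoundingBox` property) is to remove it at the start of the same delegate callback:

```swift
// Hypothetical property on your ViewController holding the preview box created in loadBoundingBox()
var placeholderBoundingBox: BlackMirrorzBoundingBox?

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

    //1. Check We Have A Valid ARObject Anchor
    guard let objectAnchor = anchor as? ARObjectAnchor else { return }

    //2. Remove The Placeholder Box Which Was Shown While Detecting
    placeholderBoundingBox?.removeFromParentNode()
    placeholderBoundingBox = nil

    //3. Create A Bounding Box Around The Detected Object As Before
    let scale = CGFloat(objectAnchor.referenceObject.scale.x)
    let boundingBoxNode = BlackMirrorzBoundingBox(points: objectAnchor.referenceObject.rawFeaturePoints.points, scale: scale)
    node.addChildNode(boundingBoxNode)
}
```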
Which yields something like this:
In regard to a detection timer, there is an example in the Apple sample code which displays how long it takes to detect the model.
In its crudest form (not accounting for milliseconds) you can do something like this:
First create a Timer and a var to store the detection time, e.g.:
var detectionTimer = Timer()
var detectionTime: Int = 0
Then when you run your ARSessionConfiguration, initialise the timer, e.g.:
/// Starts The Detection Timer
func startDetectionTimer(){
    detectionTimer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(logDetectionTime), userInfo: nil, repeats: true)
}

/// Increments The Total Detection Time Before The ARReferenceObject Is Detected
@objc func logDetectionTime(){
    detectionTime += 1
}
Then when the ARReferenceObject has been detected, invalidate the timer and log the time, e.g.:
//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------

extension ViewController: ARSCNViewDelegate{

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARObject Anchor
        guard let _ = anchor as? ARObjectAnchor else { return }

        //2. Stop The Timer
        detectionTimer.invalidate()

        //3. Log The Detection Time
        print("Total Detection Time = \(detectionTime) Seconds")

        //4. Reset The Detection Time
        detectionTime = 0
    }
}
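If you do want finer granularity than whole seconds, one alternative sketch is to record a timestamp when you start detection and compute the elapsed time when the anchor is added, rather than incrementing a counter once per second (`detectionStartTime` is a hypothetical property, not part of the Apple sample code):

```swift
import QuartzCore

// Hypothetical property recording when detection began (set this just after calling session.run)
var detectionStartTime: CFTimeInterval = 0

/// Records The Time At Which Detection Started
func startDetectionClock() {
    detectionStartTime = CACurrentMediaTime()
}

/// Logs The Elapsed Time; Call This From renderer(_:didAdd:for:) When The ARObjectAnchor Arrives
func logPreciseDetectionTime() {
    let elapsed = CACurrentMediaTime() - detectionStartTime
    print(String(format: "Total Detection Time = %.3f Seconds", elapsed))
}
```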
This should be more than enough to get you started...
Please note that this example doesn't provide a boundingBox while scanning an object (look at the Apple sample code for that); it provides one based on an existing ARReferenceObject, which is what your question implies (assuming I have interpreted it correctly).