Great work!
However, when projecting the end effector onto the 2D image, I consistently observe a spatial bias in certain examples. I suspect that the camera's extrinsic parameters are incorrect.
First, I read the camera intrinsics from the SVO file, and the cartesian_position and camera extrinsics from trajectory.h5. I am certain that the issue is not caused by inverting the matrix the wrong way. I used the following code to annotate the video:
```python
# droid_raw/1.0.1/AUTOLab/success/2023-07-08/Sat_Jul__8_08:57:28_2023
import cv2
import numpy as np
from scipy.spatial.transform import Rotation

...

# Build the 4x4 camera pose from the 6-DoF extrinsics
# (xyz translation + Euler angles), then invert it to get world-to-camera.
extrinsic = np.eye(4)
extrinsic[:3, :3] = Rotation.from_euler("xyz", extrinsic1[3:]).as_matrix()
extrinsic[:3, 3] = extrinsic1[:3]
extrinsic = np.linalg.inv(extrinsic)

# traj: N x 4 homogeneous end-effector positions in the world frame.
camera_cor = np.einsum("ij,nj->ni", extrinsic, traj)[:, :3]
image_cor = np.einsum("ij,nj->ni", intrinsic1, camera_cor)
image_cor = image_cor[:, :2] / image_cor[:, 2:]

cap = ...
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = FFmpegVideoWriter("test.mp4", 30, width, height)
idx = 0
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    frame = frame[:, :width, :]  # crop to the expected frame width
    frame = cv2.circle(frame, (int(image_cor[idx, 0]), int(image_cor[idx, 1])),
                       5, (0, 0, 255), -1)
    out.write(frame)
    idx += 1
out.release()
cap.release()
```
The resulting video shows that the annotated points are synchronized with the robotic arm's motion, but there is a consistent spatial offset.
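As a sanity check on the extrinsics convention, here is a minimal, self-contained sketch (synthetic values, not DROID data) showing that treating the stored 6-DoF values as a camera pose versus as a world-to-camera transform yields different pixel coordinates, so getting the convention wrong would produce exactly this kind of offset:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def project(K, T_w2c, pts_world_h):
    """Project N x 4 homogeneous world points with a 4x4 world-to-camera matrix."""
    cam = np.einsum("ij,nj->ni", T_w2c, pts_world_h)[:, :3]
    uv = np.einsum("ij,nj->ni", K, cam)
    return uv[:, :2] / uv[:, 2:]

# Hypothetical intrinsics (not from an SVO file).
K = np.array([[500., 0., 320.],
              [0., 500., 240.],
              [0., 0., 1.]])

# Sanity check: identity extrinsic, point 1 m ahead -> principal point.
pt = np.array([[0., 0., 1., 1.]])
assert np.allclose(project(K, np.eye(4), pt), [[320., 240.]])

# A non-trivial camera pose (synthetic values).
pose = np.eye(4)
pose[:3, :3] = Rotation.from_euler("xyz", [0.1, 0.2, 0.3]).as_matrix()
pose[:3, 3] = [0.4, -0.1, 0.2]

uv_inv = project(K, np.linalg.inv(pose), pt)  # values interpreted as camera pose
uv_raw = project(K, pose, pt)                 # values interpreted as world-to-camera
print(uv_inv, uv_raw)  # the two conventions land on different pixels
```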
test.mp4
Furthermore, I annotated the orientation of the cartesian_position coordinates (droid_raw/1.0.1/AUTOLab/success/2023-07-08/Sat_Jul__8_08:57:28_2023):
trajectory_ext2_cam_left.mp4
trajectory_ext1_cam_left.mp4
I also visualized the dataset (droid_raw/1.0.1/AUTOLab/success/2023-07-08/Sat_Jul__8_08:57:28_2023) using rerun-io/python-example-droid-dataset: Visualizing the DROID dataset using Rerun (github.com). When I hover the mouse over the 3D robotic arm, the corresponding point in the 2D camera view is clearly incorrect.
