Overview Day 3

Welcome to day 3 ✨!

We will start the day with a series of lectures covering topics ranging from multi-animal pose estimation and real-time experiments to the Model Zoo 😎.

In the practical part, we will keep improving our DeepLabCut models from yesterday. If you’re interested in tracking multiple animals, head to the practical part for multiple animals.

Then you can also work with the Model Zoo and DLC-live (try to reserve at least one hour for this).

Day 3: Take-home messages

Tip

  • What is active learning?

  • How do you get a good pose tracking model?

Day 3: Major goals ⚽️

Important

  • Let’s try to get some videos well analyzed (i.e., good pose estimation predictions). Then we will be able to dig into kinematics, or into supervised and unsupervised behavioral analysis!

How can you check whether the videos are well analyzed 🤔?

  • Create labeled videos 📽 and watch them to see if the predictions on new videos are accurate. You can use `deeplabcut.create_labeled_video`.

  • Visualize the trajectories over time and check that they are smooth, which is normally the case for biological motion unless, e.g., your frame rate is very low. You can use `deeplabcut.plot_trajectories`; check out the figure below to get an idea of the output.
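The smoothness check in the second bullet can also be done programmatically: compute the frame-to-frame displacement of a tracked body part and flag suspiciously large jumps. The sketch below uses synthetic coordinates rather than real DeepLabCut output, and the column names and pixel threshold are illustrative assumptions, not DeepLabCut API.

```python
import numpy as np
import pandas as pd

# Synthetic pose trajectory: smooth circular motion plus one artificial
# tracking glitch at frame 60. (Illustrative data, not DeepLabCut output.)
n_frames = 100
t = np.arange(n_frames)
x = np.sin(t / 10.0) * 50 + 100
y = np.cos(t / 10.0) * 50 + 100
x[60] += 80  # simulate a tracking glitch

traj = pd.DataFrame({"x": x, "y": y})

# Frame-to-frame displacement; smooth biological motion keeps this small.
step = np.sqrt(traj["x"].diff() ** 2 + traj["y"].diff() ** 2)

# Flag frames whose jump exceeds an (assumed) threshold in pixels.
threshold = 20.0
suspect_frames = step[step > threshold].index.tolist()
print(suspect_frames)  # → [60, 61]: the jump into and out of the glitch
```

Frames flagged this way are good candidates for extracting outlier frames and refining your labels.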

For both features, refer to the user guide 📕 for details.
