Spatial Alignment of Captured Inertial Measurement Unit Trajectory and 2D Video For Golf Swing Analysis

Maani; Rouzbeh; et al.

Patent Application Summary

U.S. patent application number 15/243341 was filed with the patent office on 2018-02-22 for spatial alignment of captured inertial measurement unit trajectory and 2d video for golf swing analysis. The applicant listed for this patent is Seiko Epson Corporation. Invention is credited to Michael Belshaw, Irina Kezele, Rouzbeh Maani, Jie Wang.

Application Number: 20180050254 (15/243341)
Family ID: 61191036
Filed Date: 2018-02-22

United States Patent Application 20180050254
Kind Code A1
Maani; Rouzbeh; et al. February 22, 2018

Spatial Alignment of Captured Inertial Measurement Unit Trajectory and 2D Video For Golf Swing Analysis

Abstract

A method for spatial alignment of golf club inertial measurement data and two-dimensional video for golf club swing analysis is provided. The method includes capturing inertial measurement data of a golf club swing through an inertial measurement unit (IMU), and sending the inertial measurement data from the inertial measurement unit to a computing device. The computing device is configured to align a first axis of the inertial measurement data to a first axis of further inertial measurement data from a mobile device, estimate translation of the inertial measurement data to two-dimensional video captured by the mobile device, align second and third axes of the inertial measurement data of the golf club swing to second and third axes of the two-dimensional video, and overlay a projected golf club trajectory based on the aligned captured inertial measurement data onto the two-dimensional video.


Inventors: Maani; Rouzbeh; (Toronto, CA) ; Wang; Jie; (Markham, CA) ; Belshaw; Michael; (Richmond Hill, CA) ; Kezele; Irina; (Richmond Hill, CA)
Applicant:
Name: Seiko Epson Corporation
City: Tokyo
Country: JP
Family ID: 61191036
Appl. No.: 15/243341
Filed: August 22, 2016

Current U.S. Class: 1/1
Current CPC Class: G06K 9/00342 20130101; G06K 9/00523 20130101
International Class: A63B 69/36 20060101 A63B069/36

Claims



1. A method for spatial alignment of golf club inertial measurement data and two-dimensional video for golf club swing analysis, comprising: capturing inertial measurement data of a golf club swing through an inertial measurement unit (IMU); and sending the inertial measurement data of the golf club swing from the inertial measurement unit to a computing device, so that the computing device aligns a first axis of the inertial measurement data of the golf club swing to a first axis of further inertial measurement data from a mobile device, estimates translation of the inertial measurement data of the golf club swing to two-dimensional video of the golf club swing captured by the mobile device, aligns second and third axes of the inertial measurement data of the golf club swing to second and third axes of the two-dimensional video, and overlays a projected golf club trajectory based on the aligned captured inertial measurement data of the golf club swing onto the two-dimensional video.

2. The method of claim 1, further comprising: attaching a marker to the IMU, so that the two-dimensional video captures the golf club swing and the marker, wherein alignment of the second and third axes of the inertial measurement data of the golf club swing to second and third axes of the two-dimensional video is based on detecting the marker in video frames.

3. The method of claim 1, further comprising: activating a flashing light attached to the IMU, so that the two-dimensional video captures the golf club swing and the flashing light, wherein alignment of second and third axes of the inertial measurement data of the golf club swing to second and third axes of the two-dimensional video is based on detecting the flashing light in video frames.

4. The method of claim 1, wherein: the first axis is a z axis, wherein the z axis is oriented according to gravity; the second axis is an x axis, wherein the x axis is oriented according to a ball running direction; and the third axis is a y axis, wherein the y axis is orthogonal to the x axis and the z axis.

5. The method of claim 1, wherein: an origin of a coordinate system for the inertial measurement data of the golf club swing includes a location of a golf ball prior to or at moment of impact of a golf club head to the golf ball.

6. The method of claim 1, wherein: the computing device solves pitch and roll angle by first axis alignment; and the computing device solves yaw angle by second and third axis alignment.

7. A method for spatial alignment of golf-club inertial measurement data and two-dimensional video for golf club swing analysis, performed by a computing device, comprising: receiving captured inertial measurement data of a golf club swing from an inertial measurement unit (IMU) attached to a golf club; receiving, from a device having a video camera, a two-dimensional video of the golf club swing; determining translation of the inertial measurement data of the golf club swing to the two-dimensional video of the golf club swing; aligning first and second axes of the inertial measurement data of the golf club swing to first and second axes of the two-dimensional video; and overlaying a projected golf club trajectory onto the two-dimensional video, based on alignment of the inertial measurement data of the golf club swing and the two-dimensional video.

8. The method of claim 7, wherein the device having the video camera is a mobile device, and further comprising: receiving, from the mobile device, inertial measurement data of the mobile device in association with the two-dimensional video; and aligning a third axis of the inertial measurement data of the golf club swing to a third axis of the inertial measurement data of the mobile device.

9. The method of claim 7, wherein the device having the video camera is a mobile device, and further comprising: sending a video having the overlaid projected golf club trajectory on the two-dimensional video of the golf club swing, to the mobile device for display on the mobile device.

10. The method of claim 7, further comprising: detecting a marker on a floor in the two-dimensional video; and calculating rotation and translation from marker coordinate space to video camera coordinate space, wherein the determining translation is based on a measured distance between a golf ball and an origin of the marker or detecting the golf ball in a two-dimensional video frame and estimating a size of the golf ball.

11. The method of claim 7, wherein the aligning first and second axes comprises maximizing a similarity between a re-projected captured golf club line based on the captured inertial measurement data of the golf club swing and a detected golf club line from a plurality of two-dimensional video frames.

12. The method of claim 7, wherein the aligning first and second axes comprises minimizing an error between a re-projected captured golf club line based on the captured inertial measurement data of the golf club swing and a detected sensor location of the IMU from a plurality of two-dimensional video frames.

13. The method of claim 7, wherein the receiving the captured inertial measurement data and the receiving the two-dimensional video are by wireless connection.

14. A tangible, non-transitory, computer-readable media having instructions thereupon which, when executed by a processor, cause the processor to perform a method comprising: receiving, from an inertial measurement unit (IMU) attached to a golf club, inertial measurement data of a golf club swing; receiving, from a mobile device, a two-dimensional video of the golf club swing in association with inertial measurement data relative to the mobile device; aligning a first axis of the inertial measurement data of the golf club swing to a first axis of the inertial measurement data relative to the mobile device; determining translation of the inertial measurement data of the golf club swing to the two-dimensional video of the golf club swing; aligning second and third axes of the inertial measurement data of the golf club swing to second and third axes of the two-dimensional video; and overlaying a projected golf club swing trajectory onto the two-dimensional video, based on alignment of the inertial measurement data of the golf club swing and the two-dimensional video.

15. The computer-readable media of claim 14, wherein the method further comprises: reading gravity sensor readings from the mobile device, wherein the aligning the first axis of the inertial measurement data of the golf club swing to the first axis of the inertial measurement data relative to the mobile device comprises aligning gravity direction captured by an IMU of the mobile device and gravity direction captured by the IMU attached to the golf club.

16. The computer-readable media of claim 14, wherein the method further comprises: displaying a video of the overlaid projected golf club swing trajectory on the two-dimensional video; receiving, from user input, an identification of a location of the IMU attached to the golf club in a video frame; re-aligning the second and third axes of the inertial measurement data of the golf club swing to the second and third axes of the two-dimensional video, based on the identification of the location in the video frame; and overlaying a revised projected golf swing trajectory onto the two-dimensional video.

17. The computer-readable media of claim 14, wherein the method further comprises: forming a background model, based on averaging video frames; subtracting the background model from a plurality of video frames; detecting edges in the plurality of video frames with the background model subtracted; and selecting a longest line from among the edges, to detect a golf club in the plurality of video frames.

18. The computer-readable media of claim 14, wherein the method further comprises: detecting a golf ball in frames in the two-dimensional video, based on searching for a circular pattern; and eliminating one or more false positives, based on symmetry of intensity and gradient direction of illumination of the golf ball.

19. The computer-readable media of claim 14, wherein the determining translation is based upon detecting a golf ball in one or more frames of the two-dimensional video and estimating a size of the golf ball in the one or more frames of the two-dimensional video.

20. The computer-readable media of claim 14, wherein the method further comprises: transforming the inertial measurement data of the golf club swing from a coordinate system of the IMU attached to the golf club to a coordinate system of a camera of the mobile device, based on the aligning the first axis, the translation, and the aligning the second and third axes, wherein the overlaying is based on the transforming.

21. A tangible, non-transitory, computer-readable media having instructions thereupon which, when executed by a processor of a mobile device, cause the processor to perform a method comprising: receiving, from an inertial measurement unit (IMU) attached to a golf club, inertial measurement data of a golf club swing; capturing, on the mobile device, a two-dimensional video of the golf club swing in association with inertial measurement data of the mobile device; aligning z axis of the inertial measurement data of the golf club swing to z axis of the inertial measurement data of the mobile device; determining translation of the inertial measurement data of the golf club swing to the two-dimensional video of the golf club swing; aligning x and y axes of the inertial measurement data of the golf club swing to x and y axes of the two-dimensional video; overlaying a projected golf club swing trajectory onto the two-dimensional video, based on alignment of the inertial measurement data of the golf club swing and the two-dimensional video; and displaying, on the mobile device, the overlaid projected golf club swing trajectory on the two-dimensional video.
Description



BACKGROUND

[0001] As an increasingly popular sport, golf has attracted millions of people around the world. Athletes and amateurs are always looking for ways to improve their skills. Sensor-based golf coaching systems are commercially available. One such system provides an IMU (inertial measurement unit) sensor, denoted as M-Tracer™, on the golf club. The sensor tracks the golf club and outputs a high-frequency swing trajectory as well as many other metrics such as impact speed, shaft angle, etc. Although sensor-based golf coaching systems provide useful information, it is still difficult for a typical user to understand the information and link it to his or her performance. It is within this context that the embodiments arise.

SUMMARY

[0002] In some embodiments, a method for spatial alignment of golf club inertial measurement data and two-dimensional video for golf club swing analysis is provided. The method includes capturing inertial measurement data of a golf club swing through an inertial measurement unit (IMU), and sending the inertial measurement data of the golf club swing from the inertial measurement unit to a computing device. The computing device is configured to align a first axis of the inertial measurement data of the golf club swing to a first axis of further inertial measurement data from a mobile device, estimate translation of the inertial measurement data of the golf club swing to two-dimensional video of the golf club swing captured by the mobile device, align second and third axes of the inertial measurement data of the golf club swing to second and third axes of the two-dimensional video, and overlay a projected golf club trajectory based on the aligned captured inertial measurement data of the golf club swing onto the two-dimensional video.

[0003] In some embodiments, a method for spatial alignment of golf-club inertial measurement data and two-dimensional video for golf club swing analysis, performed by a computing device is provided. The method includes receiving captured inertial measurement data of a golf club swing from an inertial measurement unit (IMU) attached to a golf club, and receiving, from a device having a video camera, a two-dimensional video of the golf club swing. The method includes determining translation of the inertial measurement data of the golf club swing to the two-dimensional video of the golf club swing. The method includes aligning first and second axes of the inertial measurement data of the golf club swing to first and second axes of the two-dimensional video, and overlaying a projected golf club trajectory onto the two-dimensional video, based on alignment of the inertial measurement data of the golf club swing and the two-dimensional video.

[0004] In some embodiments, a tangible, non-transitory, computer-readable media having instructions thereupon which, when executed by a processor, cause the processor to perform a method. The method includes receiving, from an inertial measurement unit (IMU) attached to a golf club, inertial measurement data of a golf club swing and receiving, from a mobile device, a two-dimensional video of the golf club swing in association with inertial measurement data relative to the mobile device. The method includes aligning a first axis of the inertial measurement data of the golf club swing to a first axis of the inertial measurement data relative to the mobile device, and determining translation of the inertial measurement data of the golf club swing to the two-dimensional video of the golf club swing. The method includes aligning second and third axes of the inertial measurement data of the golf club swing to second and third axes of the two-dimensional video, and overlaying a projected golf club swing trajectory onto the two-dimensional video, based on alignment of the inertial measurement data of the golf club swing and the two-dimensional video.

[0005] Other aspects and advantages of the embodiments will become apparent from the following detailed description taken in conjunction with the accompanying drawings which illustrate, by way of example, the principles of the described embodiments.

BRIEF DESCRIPTION OF THE DRAWINGS

[0006] The described embodiments and the advantages thereof may best be understood by reference to the following description taken in conjunction with the accompanying drawings. These drawings in no way limit any changes in form and detail that may be made to the described embodiments by one skilled in the art without departing from the spirit and scope of the described embodiments.

[0007] FIG. 1 depicts an inertial measurement unit (IMU) captured golf swing trajectory overlaid on a video, in a golf coaching system in accordance with some embodiments.

[0008] FIG. 2 is a flow diagram of a method for overlaying an IMU captured golf swing trajectory onto a two-dimensional video in accordance with some embodiments.

[0009] FIG. 3A depicts a coordinate system for an IMU, relative to gravity and a ball running direction in accordance with some embodiments.

[0010] FIG. 3B is a front view of a cell phone camera coordinate system in accordance with some embodiments.

[0011] FIG. 3C is a front view of a cell phone IMU coordinate system in accordance with some embodiments.

[0012] FIG. 4 is a flow diagram of a method to align z-axis of IMU coordinate space and camera coordinate space in accordance with some embodiments.

[0013] FIG. 5 is a flow diagram of a method to estimate translation between IMU coordinate space and camera coordinate space in accordance with some embodiments.

[0014] FIG. 6 depicts positioning a golf club straight up and down for a distance estimation from a frontal camera view in accordance with some embodiments.

[0015] FIG. 7 is a flow diagram of a method to align x and y axes of IMU coordinate space and camera coordinate space in accordance with some embodiments.

[0016] FIG. 8 is a flow diagram of a method for user assisted mode in accordance with some embodiments.

[0017] FIGS. 9A and 9B show examples of user assisted mode: clicking on a location of the golf ball (left) in a video frame and clicking on a location of the IMU (right) in a video frame in accordance with some embodiments.

[0018] FIG. 10 depicts a marker coordinate system in accordance with some embodiments.

[0019] FIG. 11 shows measurements of a ball location in marker coordinate space in accordance with some embodiments.

[0020] FIGS. 12A and 12B show symmetry of intensity (left) and symmetry of gradient direction (right) in images of golf balls in accordance with some embodiments.

[0021] FIGS. 13A and 13B depict using a distance transform to find the center of the ball in accordance with some embodiments.

[0022] FIG. 14 depicts fitting a semicircle to the ball in accordance with some embodiments.

[0023] FIGS. 15A and 15B depict intensity profiles of lines through images of golf balls in accordance with some embodiments.

[0024] FIG. 16 is a flow diagram of a method for determining a time bias between a video frame and an inertial measurement unit sample, for synchronizing in accordance with some embodiments.

[0025] FIGS. 17A-C show an example of false impact frame detection in accordance with some embodiments.

[0026] FIG. 18 depicts a temporal pattern of intensity at the ball location, in video frames in accordance with some embodiments.

[0027] FIG. 19 depicts temporal alignment by changing the time offset of the inertial measurement unit samples or frames relative to video frames in accordance with some embodiments.

[0028] FIG. 20 is a block diagram of a golf coaching system in accordance with some embodiments of the present disclosure.

[0029] FIG. 21 is a flow diagram of a method for spatial alignment of golf-club inertial measurement data and two-dimensional video for golf club swing analysis in accordance with some embodiments.

[0030] FIG. 22 is an illustration showing an exemplary computing device which may implement the embodiments described herein.

DETAILED DESCRIPTION

[0031] A golf coaching system for golf swing analysis performs spatial and temporal alignment of an inertial measurement unit (IMU) captured golf swing and a two-dimensional video of the golf swing, using an apparatus and various methods described herein. The methods can be performed on one or more processors, such as a processor of an IMU, a processor of a computing device and/or a processor of a mobile device (which could also be a computing device). One device that is suitable for performing portions of various methods, and for serving as a portion of a suitable apparatus, is the assignee's M-Tracer™, an IMU that can be mounted to a golf club. The M-Tracer™ is equipped with wireless communication and can send IMU data to another wireless device. Although embodiments are described herein using the M-Tracer™ as the IMU, it should be appreciated that variations and further embodiments are readily devised using other IMU systems, as the embodiments are not limited to the M-Tracer™ product.

[0032] FIG. 1 depicts an inertial measurement unit (IMU) captured golf swing trajectory overlaid on a video, in a golf coaching system. The system allows a user to directly see a visualization of the trajectory on top of his or her golf swing video as captured by a mobile device such as a smart phone or a tablet, as shown in FIG. 1. A projected golf club swing trajectory 101 is overlaid onto a two-dimensional video 103 of the golf club swing, and displayed on a view screen of a mobile device, in this example. Since the M-Tracer™ system (or any other suitable IMU system) and the camera system have different coordinate systems, a method is needed to spatially align these coordinate systems. In this disclosure, a method to automatically calibrate the IMU system and the camera system, spatially and temporally, is described. After calibration, the captured trajectory can be overlaid on top of the video.

[0033] With reference to FIGS. 1-15 and 20, a method and apparatus to spatially align the IMU captured golf swing trajectory with 2D (two-dimensional) video captured by a static mobile device such as a smart phone or tablet are disclosed. After alignment, the user is able to visualize his or her swing trajectory on top of the 2D video. The method can automatically estimate the rotation and translation between camera coordinate space and IMU system coordinate space, with the following steps:

[0034] (1) Read gravity sensor readings from the mobile device during static time.

[0035] (2) Align the z-axis by aligning the gravity direction captured by the mobile device and the IMU system, e.g., M-Tracer™.

[0036] (3) Estimate translation by detecting the golf ball in a 2D video frame and estimating the size of the golf ball.

[0037] (4) Align the x-/y-axes by either (a) maximizing the similarity between a re-projected IMU system captured golf club line and a detected golf club line from multiple 2D video frames, or (b) minimizing the error between the re-projected IMU system sensor location and the detected IMU system sensor location from multiple 2D video frames. The location of the IMU system can be detected by attaching a marker on the sensor surface, or by detecting an LED (light emitting diode) or other flashing or non-flashing light positioned on the sensor, through analysis of the video frames.

[0038] (5) User assisted mode, in some embodiments, can be activated to improve cases that have inaccurate output, with minimal user assistance.

[0039] Also disclosed herein is a marker based method to spatially align the IMU system captured golf swing trajectory with a 2D video captured by a static regular camera that does not necessarily come with an embedded IMU sensor. The method has the following steps:

[0040] (1) Put a marker on the floor.

[0041] (2) Perform marker detection and calculate rotation and translation from marker coordinate space to camera coordinate space.

[0042] (3) Estimate translation from IMU system coordinate space to marker coordinate space by measuring the distance between the golf ball and the marker origin. In some embodiments, as an alternative, the translation from IMU system coordinate space to camera coordinate space can be estimated directly by detecting the golf ball in a 2D video frame and estimating the size of the golf ball.

[0043] (4) Align the x-/y-axes between marker space and IMU system space by either (a) maximizing the similarity between a re-projected IMU system captured golf club line and a detected club line from multiple 2D video frames, or (b) minimizing the error between the re-projected IMU system location and the detected IMU system location from multiple 2D video frames. The location of the IMU system can be detected in video frames by detecting a further marker attached on the sensor surface or by detecting the LED flashing light on the sensor.

[0044] (5) User assisted mode, in some embodiments, can be activated to improve cases that have inaccurate output, with minimal user assistance.

The methods are detailed in the following sections.

[0045] In US patent publication 2015/0072797, a system to overlay an IMU sensor captured golf swing trace onto a 2D video is provided. The method spatially aligns these two systems by requiring the golfer to stand at a predefined position. The system renders a human model on a preview screen and the golfer adjusts his or her position such that his or her posture aligns with the model. The human model is calculated based on information regarding the camera direction, such as front, back, etc., under the assumption that the golf player and camera are either parallel or perpendicular. The distance between the camera and golf player is calculated based on the golf player's height and some pre-set parameters estimated from the Japanese population. It should be appreciated that the assumed camera position will not always be guaranteed and the pre-learned parameters may not be accurate in every case. If there exists a tilt in the camera, or the distance between camera and player is inaccurate, the overlay of the sensor captured swing trajectory will deviate from the actual swing motion in the video. In such a case, a manual adjustment has to be made through a user interface.

[0046] Present embodiments, however, provide a fully automatic solution to align the IMU system signal with video without requiring any additional human interaction, except for the optional user assisted mode in some embodiments. The golfer can stand at any position without pre-adjusting position, and the system does not require the camera to be positioned parallel or perpendicular to the golf player. The method calibrates IMU system coordinate space and camera coordinate space by using IMU information from the smart phone and by detecting image features, such as using a detected golf club line to align the x-y plane rotation and using a determined golf ball size to estimate the distance between camera and golf player.

[0047] The embodiments align the IMU system trajectory with a 2D golf swing video by estimating the transformation from the IMU system coordinate system to the camera coordinate system, including three rotation angles (pitch $\theta$, roll $\phi$ and yaw $\psi$) and three translation values along the x, y and z axes. FIG. 2 is a flow diagram of a method for overlaying an IMU captured golf swing trajectory onto a two-dimensional video.

[0048] The system takes the following as input: (1) IMU reading: gravity sensor reading when the mobile device is static; (2) IMU system trajectory; (3) RGB (red, green, blue, i.e. color, or in further embodiments black and white) video from a mobile device; (4) camera intrinsics; and (5) temporal alignment information, represented as the time bias between the first frame of the IMU system reading and the first RGB video frame. The temporal alignment is assumed to be solved by some method; a method of doing temporal alignment is provided in the present disclosure with reference to FIGS. 1 and 16-20, after the method of spatial alignment. Aspects of the methods disclosed herein can be performed by various modules as described below, in various embodiments. The modules can be implemented in software executing on a processor, hardware, firmware or combinations thereof, in a computing device or mobile device. In one embodiment, a pipeline includes these modules:

[0049] (1) Z-axis alignment 202, to solve the pitch and roll angles. This module or process receives an IMU reading 212, for example from an IMU associated with a camera, and the captured M-Tracer™ trajectory 214 from the IMU attached to the golf club, and outputs z axis alignment information to the overlay 210 module or process.

[0050] (2) Translation estimation 204, by detecting the golf ball and estimating its size. This module or process receives the M-Tracer™ trajectory 214 of the golf club swing, the RGB video 212 (e.g., the two-dimensional video of the golf club swing), the camera intrinsics 214, and the time bias 216, and outputs the translation matrix to the x-y axis alignment 206 module or process and the overlay 210 module or process. Included in the translation estimation 204 are golf ball detection 220 and golf ball size estimation 222.

[0051] (3) X-Y-axis alignment 206, by detecting and tracking the golf club, or detecting and tracking the M-Tracer™ sensor, to solve the yaw angle. This module or process receives the translation matrix from the translation estimation 204 module or process, and outputs x-y axis alignment information to the overlay 210 module or process. Included in the x-y axis alignment 206 is golf club/M-Tracer™ detection and tracking 224.

[0052] (4) User assisted mode 208. This mode can be activated to improve the cases which have inaccurate output. It receives overlay information from the overlay 210 module or process, and user input 218, and outputs information to the translation estimation 204 module or process useful in golf ball detection 220.

[0053] (5) Overlay 210, to overlay the projected M-Tracer™ trajectory on the video. This module or process receives z axis alignment information from z axis alignment 202, the translation matrix from translation estimation 204, x-y axis alignment information from x-y axis alignment 206, the M-Tracer™ trajectory 214, the RGB video 212, the camera intrinsics 214, and the time bias 216. Overlay 210, in some embodiments, interacts with user assisted mode 208. The output of overlay 210 is the overlaid video 226, which is also used by translation estimation 204 for refinement of the translation matrix.

[0054] Notations and Assumptions used herein are described below.

Let $[x_T, y_T, z_T]^T$ denote a point represented in M-Tracer™ (or other suitable IMU system) coordinate space and $[x_c, y_c, z_c]^T$ denote the same point represented in camera space; thus

$$[x_c, y_c, z_c]^T = R_{MC}\,[x_T, y_T, z_T]^T + T_{MC}, \quad \text{(Eq. 1)}$$

where $R_{MC}$ and $T_{MC}$ denote the rotation and translation from the IMU system coordinate system to the camera coordinate system.

[0055] Any 3D rotation can be represented by three Euler angles: pitch ($\theta$), roll ($\phi$) and yaw ($\psi$), defined as the rotation angles around the x, y and z axes respectively. Therefore, $R_{MC}$ can be further decomposed as

$$R_{MC} = R(\theta, \phi)\,R(\psi) \quad \text{(Eq. 2)}$$

[0056] In the following sections, the method to estimate $R(\theta, \phi)$, $R(\psi)$ and $T_{MC}$ is detailed. After obtaining $R_{MC}$ and $T_{MC}$, the IMU system trajectory can be transformed to the camera coordinate system and its projection onto the 2D image plane $[u, v]$ can be calculated accordingly using the camera intrinsic parameters, i.e.,

$$u = f_x \frac{x_c}{z_c} + c_x \quad \text{(Eq. 3)}$$

$$v = f_y \frac{y_c}{z_c} + c_y \quad \text{(Eq. 4)}$$

where $f_x, f_y$ denote the focal length measured in width and height of pixels and $[c_x, c_y]$ denotes the principal point coordinate.
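As a concrete illustration of Eq. 1 together with Eqs. 3 and 4, the following is a minimal numpy sketch of the projection; the intrinsic values shown are made-up examples, not values from this disclosure.

```python
import numpy as np

def project_point(K, R, T, p_imu):
    """Project a 3D point from IMU coordinates into pixel coordinates:
    transform with Eq. 1, then apply the pinhole model of Eqs. 3-4."""
    p_cam = R @ np.asarray(p_imu, dtype=float) + T        # Eq. 1
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u = fx * p_cam[0] / p_cam[2] + cx                     # Eq. 3
    v = fy * p_cam[1] / p_cam[2] + cy                     # Eq. 4
    return u, v

# Example with illustrative intrinsics and an identity rotation.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
print(project_point(K, np.eye(3), np.zeros(3), [0.1, -0.2, 2.0]))
```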

[0057] There are some assumptions made by the proposed algorithm as follows:

(1) The camera is equipped with an IMU sensor that can measure the gravity direction relative to camera IMU coordinates.

(2) The camera is mounted to a fixed position and kept static during the golf club swing.

(3) During address time (i.e., the moment of golf club impact to the golf ball), the golf club position is close to the golf ball.

[0058] Details of the Coordinate Systems are provided below. In order for the system to determine the transformation from the IMU system coordinate system to the camera coordinate system, three body coordinates are involved: IMU system CS (coordinate system), cell phone IMU CS and cell phone camera CS. These body coordinates are defined further below, in some embodiments. Note that in various embodiments, the coordinate system is defined as follows, but the algorithm does not require these specific coordinate systems, and conversions to other coordinate systems are readily devised so that different coordinate systems may be integrated with the embodiments.

[0059] FIG. 3A depicts a coordinate system for an IMU, relative to gravity and a ball running direction. The origin (0, 0, 0) of the golf club attached IMU coordinate system is at the golf club head position when the golf club head hits the ball, and is assumed the same as the ball position. The z axis 302 is defined as aligned with gravity, with positive values along the z axis extending upward. The x axis 304 is defined as aligned with the ball running direction, with positive values along the x axis extending forward in the direction the ball travels initially after the address (i.e., moment of impact). The y axis 306 is defined as orthogonal to the x and z axes.

[0060] FIG. 3B is a front view of a cell phone camera coordinate system. The origin of the cell phone camera coordinate system could be located anywhere in the frame of the video, for example at a corner of the screen or frame, in the middle of the screen or frame, or in an upper region of the frame as depicted. The y axis 308 is pointing downward relative to the video frame. The x axis 310 is pointing rightward relative to the video frame. The z axis 312 is pointing upward perpendicular to the video frame, orthogonal to the x and y axes.

[0061] FIG. 3C is a front view of a cell phone IMU coordinate system. The origin of the cell phone IMU coordinate system could be at the location of the cell phone IMU, or elsewhere relative to the cell phone. The y axis 314 is pointing upward, parallel to the main body of the cell phone. The x axis 316 is pointing rightward relative to the main body of the cell phone. The z axis 318 is perpendicular to the main body of the cell phone and is pointing upward relative to the screen or face of the cell phone.

[0062] FIG. 4 is a flow diagram of a method to align the z axis of IMU coordinate space and camera coordinate space. In an action 402, gravity sensor values are read from a mobile device. In an action 404, a gravity-up vector in camera space is calculated according to Eq. 5. In an action 406, a gravity-up vector is read from the IMU system. In an action 408, pitch and roll angles are calculated according to Eqs. 8 and 9. This method can be performed by Module 1: Z-Axis Alignment ($R(\theta, \phi)$).

[0063] Let $[g_{xi}, g_{yi}, g_{zi}]^T$ denote the unit gravity_up vector represented in the mobile device IMU coordinate system, which can be obtained by normalizing the reading from the gravity sensor, where gravity_up denotes the gravity direction pointing upward. Let $[g_{xc}, g_{yc}, g_{zc}]^T$ denote the corresponding representation in the camera coordinate system. According to FIGS. 4 and 5, it is observed that

$$[g_{xc}, g_{yc}, g_{zc}]^T = R_{i2c}\,[g_{xi}, g_{yi}, g_{zi}]^T \quad \text{(Eq. 5)}$$

where $R_{i2c}$ denotes the rotation from IMU coordinate space to camera coordinate space, which can be obtained through an IMU-camera calibration process such as the method of Lobo and Dias published in 2007. In most smart phones and tablets, the IMU chip is installed such that the three axes are aligned with the camera axes as illustrated in FIGS. 3B and 3C; thus $R_{i2c}$ can be reasonably simplified as

$$R_{i2c} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \end{bmatrix}.$$

[0064] Let $[g_{xT}, g_{yT}, g_{zT}]^T$ denote the unit gravity_up vector represented in the M-Tracer™ coordinate system. Aligning $[g_{xc}, g_{yc}, g_{zc}]^T$ with $[g_{xT}, g_{yT}, g_{zT}]^T$ is equivalent to finding the $R(\theta, \phi)$ such that

$$[g_{xc}, g_{yc}, g_{zc}]^T = R(\theta, \phi)\,[g_{xT}, g_{yT}, g_{zT}]^T \quad \text{(Eq. 6)}$$

[0065] As depicted in FIG. 3A, in the current work, gravity_up aligns with the z-axis in the M-Tracer™ coordinate system, i.e., $[g_{xT}, g_{yT}, g_{zT}]^T = [0, 0, 1]^T$. $R(\theta, \phi)$ can be expressed as

$$R(\theta, \phi) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & \sin\phi \\ 0 & -\sin\phi & \cos\phi \end{bmatrix} \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix} \begin{bmatrix} \cos\psi & \sin\psi & 0 \\ -\sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad \text{(Eq. 7)}$$

where $R(\theta, \phi)_{13} = -\sin\theta$, $R(\theta, \phi)_{23} = \sin\phi\cos\theta$ and $R(\theta, \phi)_{33} = \cos\phi\cos\theta$. Thus the roll and pitch angles can be solved as

$$\phi = \mathrm{atan2}(g_{yc},\, g_{zc}) \quad \text{(Eq. 8)}$$

$$\theta = \mathrm{atan2}(-g_{xc},\, g_{yc}\sin\phi + g_{zc}\cos\phi) \quad \text{(Eq. 9)}$$

Thus, by setting $\psi$ to an arbitrary value to be optimized later, e.g., $\psi = 0$, $R(\theta, \phi)$ can be calculated accordingly.
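A minimal sketch of this z-axis alignment (Eqs. 6-9), assuming a normalized gravity-up vector in camera coordinates is already available from Eq. 5:

```python
import numpy as np

def pitch_roll_from_gravity(g_cam):
    """Solve roll (phi) and pitch (theta) from the unit gravity-up
    vector in camera coordinates, per Eqs. 8-9."""
    gx, gy, gz = g_cam / np.linalg.norm(g_cam)
    phi = np.arctan2(gy, gz)                                   # Eq. 8
    theta = np.arctan2(-gx, gy * np.sin(phi) + gz * np.cos(phi))  # Eq. 9
    return theta, phi

def R_theta_phi(theta, phi, psi=0.0):
    """Rotation of Eq. 7, with yaw set to an arbitrary value (0)."""
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(phi), np.sin(phi)],
                   [0, -np.sin(phi), np.cos(phi)]])
    Ry = np.array([[np.cos(theta), 0, -np.sin(theta)],
                   [0, 1, 0],
                   [np.sin(theta), 0, np.cos(theta)]])
    Rz = np.array([[np.cos(psi), np.sin(psi), 0],
                   [-np.sin(psi), np.cos(psi), 0],
                   [0, 0, 1]])
    return Rx @ Ry @ Rz

# Sanity check: R(theta, phi) maps [0, 0, 1]^T back to g_cam (Eq. 6).
g_cam = np.array([0.05, -0.1, 0.99])
g_cam /= np.linalg.norm(g_cam)
theta, phi = pitch_roll_from_gravity(g_cam)
print(R_theta_phi(theta, phi) @ np.array([0.0, 0.0, 1.0]), g_cam)
```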

[0066] FIG. 5 is a flow diagram of a method to estimate translation between IMU coordinate space and camera coordinate space. In an action 502, the ball (i.e., golf ball) is detected from the two-dimensional video frame. In an action 504, the ball size is detected from the two-dimensional video frame. In an action 506, the translation is estimated according to equation 11. This results in a transformation matrix.

[0067] This method can be performed by Module 2: Calculating Translation. As shown in FIG. 3A, the origin of the IMU system coordinate system is the location of the golf ball (and the same as the club head position when impact happens). Therefore, its translation to the camera center is equivalent to the 3D location of the ball, $[X_{ball}, Y_{ball}, Z_{ball}]$, in the camera coordinate system:

$$T_{MC} = [X_{ball}, Y_{ball}, Z_{ball}]^T \quad \text{(Eq. 10)}$$

[0068] Let $u_{ball}$, $v_{ball}$ and $rad_{ball}$ be the location and radius of the detected golf ball in a 2D image. Embodiments for the detection of ball location and ball size can be found later in the disclosure, with reference to FIGS. 8-15. $X_{ball}$, $Y_{ball}$ and $Z_{ball}$ are found by the following formulas, assuming a pin-hole camera model:

$$X_{ball} = \frac{(u_{ball} - c_x)}{f_x} Z_{ball}, \quad Y_{ball} = \frac{(v_{ball} - c_y)}{f_y} Z_{ball}, \quad Z_{ball} = \frac{f}{rad_{ball}} R_{ball}, \quad \text{(Eq. 11)}$$

where $f_x$ and $f_y$ denote the focal length, $c_x$ and $c_y$ the principal point of the camera obtained from the camera intrinsics, and $R_{ball}$ is the a priori known actual golf ball size in 3D.
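A sketch of Eq. 11; the regulation golf ball radius (about 21.3 mm) serves as the a priori size $R_{ball}$, and using the mean of $f_x$ and $f_y$ for the single focal length $f$ is an assumption made here for illustration:

```python
import numpy as np

def translation_from_ball(u_ball, v_ball, rad_ball_px, K, R_ball=0.02135):
    """Recover T_MC, the 3D ball position in camera space, from the
    detected 2D ball center and pixel radius (Eq. 10 and Eq. 11).
    R_ball is the real golf ball radius in meters (~21.35 mm)."""
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    f = (fx + fy) / 2.0          # assumption: single focal length as mean
    Z = f / rad_ball_px * R_ball                           # Eq. 11
    X = (u_ball - cx) / fx * Z
    Y = (v_ball - cy) / fy * Z
    return np.array([X, Y, Z])   # T_MC per Eq. 10
```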

[0069] It should be noted that if, in another system, the origin of the M-Tracer™ (or other IMU system) coordinate system is not the golf ball position, it is easy to convert to the presently defined coordinate system by subtracting from the whole IMU system trajectory the value of the known club head position at impact time. It should also be noted that other objects around the ball (at the same distance from the camera) with known size could be used to estimate $Z_{ball}$. For instance:

[0070] Golf club: since the size of the golf club is known, it could be used for distance estimation. For the side view this estimation has no problem but, for other angles (e.g. frontal view), the golf club should be positioned straight as shown in FIG. 6:

$$Z_{ball} = \frac{f}{\text{Length of club in 2D image}}\;\text{Length of club in real world}$$

[0071] Human height: the height of the person who swings is entered into the application and the person stands straight:

$$Z_{ball} = \frac{f}{\text{Height of human in 2D image}}\;\text{Height of human in real world}$$

[0072] FIG. 6 depicts positioning a golf club 602 straight up and down for a distance estimation from a frontal camera view. Also note that any of the distance estimation methods could give an approximation for the size of other objects with known size. For example, $Z_{ball}$ could be found from the ball size and then used to estimate the height of the person or of the straight club in the image.

[0073] FIG. 7 is a flow diagram of a method to align the x and y axes of IMU coordinate space and camera coordinate space. Aligning the XY plane can be done by finding the yaw angle. This method can be performed by Module 3: X-Y-Axis Alignment ($R(\psi)$).

[0074] To estimate yaw angle, two solutions are proposed. One solution is to maximize the projection similarity of the IMU system golf club line and the detected club line in frames of the video. The other solution is to minimize the error between the re-projected IMU system location and the detected IMU system location in 2D video. To further refine a result, an iterative process is applied to remove the outliers. In the following embodiment, the alignment is performed using the golf club line.

[0075] In an action 702, a golf club line is detected in multiple frames. In an action 704, the yaw angle is estimated according to equation 12. In an action 706, outliers are detected, and an outlier set is formed. In a determination action 708, it is determined whether an outlier set is empty. If the answer is yes, flow proceeds to the action 712, and the yaw angle is output. If the answer is no, flow proceeds to the action 710, to remove the outlier, and from there to the action 704 in order to iterate the estimation of the yaw angle.

[0076] If the outlier set is not empty in the determination action 708, flow alternatively could proceed to the action 714 to remove the outlier, and from there to action 718. Action 718 is also preceded by action 716, in which the golf club line is detected in multiple frames. In the action 718, the yaw angle is estimated according to equation 16 as an alternative to the action 704 using equation 12. Action proceeds from the action 718 to the action 706, to detect outliers.

[0077] The yaw angle is estimated by maximizing the projection similarity of the M-Tracer™ golf club line and the detected club line in the frames. Assume that the detected line at frame $i$ is $l^i = [l^i_{x1}, l^i_{y1}, l^i_{x2}, l^i_{y2}]$ and the corresponding M-Tracer™ golf club line (i.e., the line connecting the M-Tracer™ position to the club head position) is denoted by $mtl^i = [l^i_{x1}, l^i_{y1}, l^i_{z1}, l^i_{x2}, l^i_{y2}, l^i_{z2}]$.

[0078] The system solves the following maximization to find the yaw angle:

$$\arg\max_{\psi} \sum_{i \notin \text{outliers}} \text{sim}\big(\pi(K, T, mtl^i),\, l^i\big) \quad \text{(Eq. 12)}$$

where $\pi$ is the image projection function, $K$ is the camera intrinsic matrix, and $T$ is the final transformation matrix from the IMU system coordinate system to the camera system, defined as

$$T = \begin{bmatrix} R(\theta, \phi) R(\psi) & T_{MC} \\ 0 & 1 \end{bmatrix} \quad \text{and} \quad R(\psi) = \begin{bmatrix} \cos\psi & \sin\psi & 0 \\ -\sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix},$$

and $\text{sim}$ is the similarity of the projected line and the detected line at frame $i$. An explanation of each part is given below.

[0079] The image projection function $\pi(K, T, l)$ projects a 3D line segment $l = [l_{x1}, l_{y1}, l_{z1}, l_{x2}, l_{y2}, l_{z2}]$, determined by the two points $[l_{x1}, l_{y1}, l_{z1}]^T$ and $[l_{x2}, l_{y2}, l_{z2}]^T$, into a 2D line segment $l^{proj} = [l^{proj}_{x1}, l^{proj}_{y1}, l^{proj}_{x2}, l^{proj}_{y2}]$:

$$\pi(K, T, l): \quad l^{proj}_{x1} = \frac{(KT[l_{x1}, l_{y1}, l_{z1}, 1]^T)_x}{(KT[l_{x1}, l_{y1}, l_{z1}, 1]^T)_z}, \quad l^{proj}_{y1} = \frac{(KT[l_{x1}, l_{y1}, l_{z1}, 1]^T)_y}{(KT[l_{x1}, l_{y1}, l_{z1}, 1]^T)_z}, \quad l^{proj}_{x2} = \frac{(KT[l_{x2}, l_{y2}, l_{z2}, 1]^T)_x}{(KT[l_{x2}, l_{y2}, l_{z2}, 1]^T)_z}, \quad l^{proj}_{y2} = \frac{(KT[l_{x2}, l_{y2}, l_{z2}, 1]^T)_y}{(KT[l_{x2}, l_{y2}, l_{z2}, 1]^T)_z} \quad \text{(Eq. 13)}$$

and $\text{sim}$ is defined as

$$\text{sim}(l^1, l^2) = \frac{\big([l^1_{x1} - l^1_{x2},\, l^1_{y1} - l^1_{y2}] \cdot [l^2_{x1} - l^2_{x2},\, l^2_{y1} - l^2_{y2}]\big)^2}{\big((l^1_{x1} - l^1_{x2})^2 + (l^1_{y1} - l^1_{y2})^2\big)\big((l^2_{x1} - l^2_{x2})^2 + (l^2_{y1} - l^2_{y2})^2\big)} \quad \text{(Eq. 14)}$$

where $\cdot$ is the dot product.
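A sketch of the maximization in Eq. 12 using Eqs. 13 and 14, with a brute-force sweep over $\psi$ standing in for whatever 1D optimizer an implementation might actually use; function and variable names are illustrative:

```python
import numpy as np

def project_segment(K, T, seg3d):
    """Eq. 13: project a 3D segment [x1, y1, z1, x2, y2, z2] to 2D."""
    out = []
    for p in (seg3d[:3], seg3d[3:]):
        q = K @ (T @ np.append(p, 1.0))[:3]
        out += [q[0] / q[2], q[1] / q[2]]
    return out                          # [x1, y1, x2, y2]

def sim(l1, l2):
    """Eq. 14: squared cosine of the angle between two 2D segments."""
    d1 = np.array([l1[0] - l1[2], l1[1] - l1[3]])
    d2 = np.array([l2[0] - l2[2], l2[1] - l2[3]])
    return (d1 @ d2) ** 2 / ((d1 @ d1) * (d2 @ d2))

def estimate_yaw(K, R_tp, T_MC, mtl, lines, n_steps=360):
    """Eq. 12: sweep psi, build T, and keep the best-scoring angle.
    mtl: list of 3D club lines; lines: detected 2D lines per frame."""
    best_psi, best_score = 0.0, -np.inf
    for psi in np.linspace(-np.pi, np.pi, n_steps, endpoint=False):
        Rz = np.array([[np.cos(psi), np.sin(psi), 0.0],
                       [-np.sin(psi), np.cos(psi), 0.0],
                       [0.0, 0.0, 1.0]])
        T = np.eye(4)
        T[:3, :3] = R_tp @ Rz
        T[:3, 3] = T_MC
        score = sum(sim(project_segment(K, T, m), l)
                    for m, l in zip(mtl, lines))
        if score > best_score:
            best_psi, best_score = psi, score
    return best_psi
```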

[0080] An alternate method is to perform alignment with IMU system location. The yaw angle can also be estimated by minimizing the re-projection error of the IMU system location. To detect the IMU system device in the 2D video frames, a color or texture marker can be attached to the device. An alternative technique is to detect the LED light source on the device.

[0081] Let $[X^i_{MT}, Y^i_{MT}, Z^i_{MT}]^T$ be the 3D location of the IMU system at time $i$ represented in IMU system coordinate space. Its projection to 2D image space, denoted $[\hat{u}^i_{MT}, \hat{v}^i_{MT}]$, can be represented as

$$[x^i_{MT}, y^i_{MT}, z^i_{MT}]^T = K\big(R(\theta, \phi)R(\psi)[X^i_{MT}, Y^i_{MT}, Z^i_{MT}]^T + T_{MC}\big) \quad \text{(Eq. 15)}$$

$$\hat{u}^i_{MT} = x^i_{MT} / z^i_{MT}, \qquad \hat{v}^i_{MT} = y^i_{MT} / z^i_{MT}$$

[0084] Let $[u^i_{MT}, v^i_{MT}]$ be the detected IMU system location in the 2D video frame; the yaw angle can be estimated by minimizing the reprojection error

$$\arg\min_{\psi} \sum_{i \notin \text{Outliers}} \big\| \big(u^i_{MT} - \hat{u}^i_{MT},\; v^i_{MT} - \hat{v}^i_{MT}\big) \big\| \quad \text{(Eq. 16)}$$

[0085] where $\|\cdot\|$ denotes the 2-norm.

[0086] After the initial estimation of the yaw angle by either method described above, the results are further refined via an iterative outlier removal process, in some embodiments:

(1) Use N frames to estimate the yaw angle and calculate the transformation $T = [R(\theta, \phi)R(\psi)\,|\,T_{MC}]$.

(2) Find outliers, $O$, using the estimated $T$.

(3) Set new outliers, $O_{new} = O$.

(4) Do the following until $O_{new}$ is empty: [0087] (a) estimate $T$ by removing the outliers ($O$) from the computation; [0088] (b) find outliers, $O_{new}$, using the estimated $T$; [0089] (c) $O = O + O_{new}$.

[0090] To find outliers for a given transformation $T$ (assuming that $T$ is approximately correct), the system checks whether the 2D detected lines are parallel to the 3D projected lines. This check is done by comparing the distances of the two ends of the projected 3D line segment to the corresponding 2D detected line. If the difference of the distances is higher than a threshold, then the detected 2D line is an outlier.

[0091] Assume that $l^{proj} = [l^{proj}_{x1}, l^{proj}_{y1}, l^{proj}_{x2}, l^{proj}_{y2}]$ is the projection of a 3D line $l$ using the operator $\pi(K, T, l)$, and the corresponding 2D detected line is $l^{2D} = [l^{2D}_{x1}, l^{2D}_{y1}, l^{2D}_{x2}, l^{2D}_{y2}]$. The outlier function $\odot(l^{2D}, l^{proj}, \epsilon)$ returns 1 if the 2D detected line is an outlier and is defined as:

$$\odot(l^{2D}, l^{proj}, \epsilon) = \begin{cases} 0, & \text{if } \big| d(l^{2D}, [l^{proj}_{x1}, l^{proj}_{y1}]) - d(l^{2D}, [l^{proj}_{x2}, l^{proj}_{y2}]) \big| \le \epsilon \\ 1, & \text{otherwise} \end{cases} \quad \text{(Eq. 17)}$$

where $d$ is the distance of a point to a line:

$$d(l^{2D}, [l^{proj}_x, l^{proj}_y]) = \frac{\big| (l^{proj}_x - l^{2D}_{x1})(l^{2D}_{y2} - l^{2D}_{y1}) - (l^{proj}_y - l^{2D}_{y1})(l^{2D}_{x2} - l^{2D}_{x1}) \big|}{\sqrt{(l^{2D}_{x1} - l^{2D}_{x2})^2 + (l^{2D}_{y1} - l^{2D}_{y2})^2}} \quad \text{(Eq. 18)}$$
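A sketch of the outlier test of Eqs. 17 and 18, together with the iterative loop from paragraph [0086]; estimate_T and find_outliers are assumed callables standing in for module 3 and the test below, not names from this disclosure:

```python
import numpy as np

def point_line_dist(l2d, px, py):
    """Eq. 18: distance of point (px, py) to the detected 2D line."""
    x1, y1, x2, y2 = l2d
    num = abs((px - x1) * (y2 - y1) - (py - y1) * (x2 - x1))
    return num / np.hypot(x1 - x2, y1 - y2)

def is_outlier(l2d, lproj, eps):
    """Eq. 17: outlier if the projected line's endpoints are not
    (nearly) equidistant from the detected line, i.e. not parallel."""
    d1 = point_line_dist(l2d, lproj[0], lproj[1])
    d2 = point_line_dist(l2d, lproj[2], lproj[3])
    return abs(d1 - d2) > eps

def refine(frames, estimate_T, find_outliers):
    """Iterative outlier removal loop of paragraph [0086]."""
    outliers = set()
    while True:
        T = estimate_T(frames, exclude=outliers)
        new = find_outliers(frames, T) - outliers
        if not new:
            return T
        outliers |= new
```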

[0092] FIG. 8 is a flow diagram of a method for user assisted mode. This can be performed by Module 4: User Assisted Mode. Note that the user does not need to give a perfectly precise location; a region of interest is considered around the given location for the ball and line detection algorithms (described with reference to FIGS. 11-15).

[0093] In an action 802, the video with the IMU system trajectory overlaid is shown. In a decision action 804, it is determined whether the user is satisfied. If yes, the flow ends. If no, flow proceeds to the action 806, to ask the user to click on the location of the ball (e.g., with a touchscreen or mouse or other user input). In an action 808, module 2 for ball detection is run on a window around the click location. In an action 810, module 3 for x-y axis alignment is run. In an action 812, the video with the IMU system trajectory overlaid is shown. In a decision action 814, it is determined whether the user is satisfied. If yes, the flow ends. If no, flow proceeds to the action 816, to ask the user to click on the location of the IMU system in a new frame. In an action 818, a region of interest is defined around the click location for this frame and used as the initial input for module 3. Flow proceeds to the action 810, to run module 3 again.

[0094] FIGS. 9A and 9B show examples of user assisted mode: clicking on a location of the golf ball 902 (FIG. 9A) in a video frame 906 and clicking on a location of the IMU 904 (FIG. 9B) in a video frame 908. This could be performed using a touchscreen or cursor controls on a mobile device displaying the video frame. Alternatively, this could be performed on another computing device with a view screen and other types of user entry devices such as a keyboard, mouse, trackball, etc.

[0095] The above sections describe the method to align 2D video and IMU system signal by using a mobile device with an embedded IMU sensor. In this section, an alignment method with a regular camera that does not have an IMU sensor is described. The method performs the alignment by using a marker placed on the floor as shown in FIGS. 10 and 11.

[0096] FIG. 10 depicts a marker coordinate system. In this embodiment, the marker coordinate system has x and y axes 1008, 1010 on a plane defined by the marker 1002, which could be a sheet or a plate (e.g. of paper, cardboard, plastic, metal or cloth or other material) with dots or other pattern 1004 on a surface. The z axis 1006 points upward, perpendicular to the plane and the sheet or plate.

[0097] FIG. 11 shows measurements 1102 of a ball location in marker coordinate space. The marker of FIG. 10 is placed on the floor at some distance from the ball. The distance from the ball can be determined by x, y and z axis measurements.

A review of Eq. 1 shows the equation can be further extended as

$$[x_c, y_c, z_c]^T = R_{MC}[x_T, y_T, z_T]^T + T_{MC} = R_{RC}\big(R_{MR}[x_T, y_T, z_T]^T + T_{MR}\big) + T_{RC} \quad \text{(Eq. 19)}$$

[0098] $R_{RC}$ and $T_{RC}$ are the rotation and translation from marker space to camera space, which can be easily obtained by any marker detection algorithm. Thus the only remaining unknowns are $R_{MR}$ and $T_{MR}$, denoted as the rotation and translation from IMU system space to marker space. FIG. 10 illustrates the marker coordinate system. According to the right hand rule, the Z axis of the marker is perpendicular into the marker plane.

[0099] $T_{MR}$ is the translation from the IMU system coordinate system to the marker coordinate system, which is the 3D location of the ball represented in marker coordinate space. This 3D location can be measured manually from the ball to the marker origin. An example is illustrated in FIG. 11, where $T_{MR} = [-|X|, -|Y|, -|Z|]^T$.

[0100] Another way to determine translation is to use the method described above with reference to FIG. 5 and module 2. Eq. 19 can be re-written as

$$[x_c, y_c, z_c]^T = R_{RC}\big(R_{MR}[x_T, y_T, z_T]^T + T_{MR}\big) + T_{RC} = R_{RC}R_{MR}[x_T, y_T, z_T]^T + T_{MC} \quad \text{(Eq. 20)}$$

where $T_{MC}$ can be obtained following the method described with reference to FIG. 5 and module 2. User assisted mode could be activated to help ball detection, as described above with reference to FIG. 8 and module 4.

[0101] Because the marker is placed on the floor, it is reasonable to assume that the negative Z-axis of marker space is the gravity_up direction that aligns with the z-axis of the IMU system. The marker can be placed at a variety of locations, as long as the marker plane is perpendicular to gravity. Thus,

$$R_{MR} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1 \end{bmatrix} R(\psi),$$

where $R(\psi)$ is the rotation around the z-axis and $\psi$ is the yaw angle. The estimation of $R(\psi)$ can be done using the method described with reference to FIG. 7 and module 3. User assisted mode could be activated to help ball detection, as described with reference to FIG. 8 and module 4.
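A hedged sketch of the marker-based extrinsics: OpenCV's ArUco module (opencv-contrib, pre-4.7 style API) is substituted here for the unspecified "any marker detection algorithm", and the dictionary and marker length are placeholders:

```python
import cv2
import numpy as np

def marker_to_camera(gray, K, dist_coeffs, marker_len=0.15):
    """Estimate R_RC and T_RC (marker space to camera space) from one
    frame, using an ArUco marker as a stand-in floor marker."""
    aruco = cv2.aruco
    dictionary = aruco.getPredefinedDictionary(aruco.DICT_4X4_50)
    corners, ids, _ = aruco.detectMarkers(gray, dictionary)
    rvecs, tvecs, _ = aruco.estimatePoseSingleMarkers(
        corners, marker_len, K, dist_coeffs)
    R_RC, _ = cv2.Rodrigues(rvecs[0])     # rotation, marker -> camera
    T_RC = tvecs[0].reshape(3)            # translation, marker -> camera
    return R_RC, T_RC

def R_MR(psi):
    """IMU-to-marker rotation from the formula above: flip z, yaw psi."""
    flip = np.diag([1.0, 1.0, -1.0])
    Rz = np.array([[np.cos(psi), np.sin(psi), 0.0],
                   [-np.sin(psi), np.cos(psi), 0.0],
                   [0.0, 0.0, 1.0]])
    return flip @ Rz
```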

[0102] In order to detect a golf club in a frame, in one embodiment, first a background model is constructed. The background model is the average of the frames:

$$BG = \frac{1}{e - s} \sum_{i=s}^{e} I_i,$$

where $s$ and $e$ represent the first and last frames to average. The best range $[s, e]$ is the range covering fast-moving frames (e.g., from golf club top position to impact), which can be obtained from the synchronized IMU system data. For each frame $I_i$, first the background is subtracted and then the image is normalized:

$$I_i = I_i - BG, \qquad I_i(x, y) = \frac{I_i(x, y) - \min(I_i(x, y))}{\max(I_i(x, y)) - \min(I_i(x, y))} \cdot 255.$$

[0103] Next, the Canny edge detector and then the probabilistic Hough transform are applied to find lines. The lines are then filtered by the following constraints:

[0104] Angle constraint: the angle of the line with the x axis should be within the range $[a - t, a + t]$ or $[-a - t, -a + t]$, where $a$ is the expected angle of the golf club with the x axis (e.g., 60°) and $t$ the tolerance (e.g., 10°). Note, this constraint limits the detection of the golf club to some specific positions. However, it helps to reduce many false positives. As only one parameter needs to be estimated, a couple of accurately detected lines will be sufficient to solve the unknown.

[0105] Color/width constraint: the line should have the given minimum color with the minimum width. If multiple lines are detected, the longest line, $l^l = [l^l_{x1}, l^l_{y1}, l^l_{x2}, l^l_{y2}]$ (with $[l^l_{x1}, l^l_{y1}]$ denoting the start point and $[l^l_{x2}, l^l_{y2}]$ the end point of the line), is considered as the golf club. The other lines, $l^i$, are compared with $l^l$ and, if they are consistent (i.e., approximately parallel to $l^l$ and within a specific distance), they are kept. Finally, $l^l$ and the consistent lines $l^i$ are averaged as follows (a code sketch of this detector follows below):

$$[l^{avg}_{x1}, l^{avg}_{y1}] = \left[ \frac{1}{N+1}\Big(l^l_{x1} + \sum_i \big(l^i_{x1} + t\,v^i_x\big)\Big),\; \frac{1}{N+1}\Big(l^l_{y1} + \sum_i \big(l^i_{y1} + t\,v^i_y\big)\Big) \right]$$

$$[l^{avg}_{x2}, l^{avg}_{y2}] = \left[ \frac{1}{N+1}\Big(l^l_{x2} + \sum_i \big(l^i_{x2} + t\,v^i_x\big)\Big),\; \frac{1}{N+1}\Big(l^l_{y2} + \sum_i \big(l^i_{y2} + t\,v^i_y\big)\Big) \right]$$

where $[l^{avg}_{x1}, l^{avg}_{y1}]$ denotes the start point of the average line, $[l^{avg}_{x2}, l^{avg}_{y2}]$ the end point of the average line, $t$ the parametric value of the line, and $v^i = [v^i_x, v^i_y]$ the vector representing the direction:

$$t = \frac{(l^l_x - l^i_x)v^i_x + (l^l_y - l^i_y)v^i_y}{\|v^i\|^2}, \qquad [v^i_x, v^i_y] = [l^i_{x2} - l^i_{x1},\; l^i_{y2} - l^i_{y1}]$$

This average line is computed for each given frame.
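A sketch of the club-line detector (background averaging, Canny, probabilistic Hough, angle constraint, longest line); all thresholds are placeholders, and the consistent-line averaging step is omitted for brevity:

```python
import cv2
import numpy as np

def detect_club_line(frames, idx, s, e, a_deg=60.0, t_deg=10.0):
    """Detect the club line in grayscale frame `idx`; [s, e] is the
    fast-motion range used to build the background model BG."""
    bg = np.mean([f.astype(np.float32) for f in frames[s:e]], axis=0)
    diff = frames[idx].astype(np.float32) - bg
    diff = (diff - diff.min()) / (diff.max() - diff.min() + 1e-9) * 255
    edges = cv2.Canny(diff.astype(np.uint8), 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=60, maxLineGap=5)
    if lines is None:
        return None
    best, best_len = None, 0.0
    for x1, y1, x2, y2 in lines[:, 0]:
        ang = np.degrees(np.arctan2(y2 - y1, x2 - x1))
        if abs(abs(ang) - a_deg) > t_deg:     # angle constraint
            continue
        length = np.hypot(x2 - x1, y2 - y1)
        if length > best_len:                 # keep the longest line
            best, best_len = (x1, y1, x2, y2), length
    return best
```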

[0106] For golf club tracking, and in order to improve the robustness of the algorithm, in some embodiments the start and end points of the line, along with their average (i.e., the middle point), are tracked by a pyramidal implementation of the Lucas-Kanade tracking method. If no line is detected but the points are tracked properly, the line is estimated by fitting a line to the points (e.g., by the least squares method).
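A sketch of the point tracking with OpenCV's pyramidal Lucas-Kanade implementation and a least-squares refit; window size and pyramid depth are placeholder values:

```python
import cv2
import numpy as np

def track_club_points(prev_gray, next_gray, line):
    """Track the club line's endpoints and midpoint between frames,
    then refit a line to the successfully tracked points."""
    x1, y1, x2, y2 = line
    pts = np.float32([[x1, y1], [x2, y2],
                      [(x1 + x2) / 2, (y1 + y2) / 2]]).reshape(-1, 1, 2)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, pts, None, winSize=(21, 21), maxLevel=3)
    good = nxt[status.ravel() == 1].reshape(-1, 2)
    if len(good) < 2:
        return None
    # least-squares line fit through the tracked points
    vx, vy, x0, y0 = cv2.fitLine(good, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    return (x0, y0, vx, vy)       # point-plus-direction form
```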

[0107] For ball detection in some embodiments, in the 2D image the system first detects the golf club (see above) in the address pose (i.e., relatively static frames before the swing starts). Assume that the golf club is detected as $[GC_{x1}, GC_{y1}, GC_{x2}, GC_{y2}]$, where $[GC_{x1}, GC_{y1}]$ denotes the upper side of the club (e.g., the IMU system position) and $[GC_{x2}, GC_{y2}]$ is the head of the club. The circular pattern centered at $[x, y]$ and of radius $R$ is defined as:

$$\text{Pat}(x', y')[x, y, R] = \begin{cases} 1, & \sqrt{(x - x')^2 + (y - y')^2} \le R \\ 0, & \text{otherwise.} \end{cases}$$

[0108] The system searches for the circle pattern in a window $W$ obtained by extending the golf club from its head ($Ext$):

$$W = \{[x, y] : \|[x, y] - Ext\| \le nr\}, \quad (n \ge 1)$$

$$Ext = \{[x, y] = [(GC_{x2} - v(1)\,i) - R/2,\; (GC_{y2} - v(2)\,i) - R/2]\}, \quad (a \le i \le b),$$

where $\|\cdot\|$ is the L2 norm, $a$ and $b$ denote the search range along the extended line (i.e., represented by a parametric equation), and $v$ is the club line direction vector:

$$v = [GC_{x1} - GC_{x2},\; GC_{y1} - GC_{y2}]$$

[0109] To get the initial estimate, the system finds the circular pattern in the window W by maximizing the cross correlation of the pattern and the image I:

$$\arg\max_{[x, y] \in W} \sum_{x', y'} Pat(x', y')[x, y, R]\; I(x', y').$$
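A possible implementation of this search, assuming I is a 2D grayscale array and window_pts enumerates the candidate centers of the window W; the names and the bounds handling are illustrative.

import numpy as np

def find_ball_candidate(I, window_pts, R):
    """Initial ball estimate: maximize the cross correlation of a
    binary disc pattern of radius R with image I over candidate
    centers in the window W (here passed as window_pts)."""
    yy, xx = np.mgrid[-R:R + 1, -R:R + 1]
    disc = (xx ** 2 + yy ** 2 <= R ** 2).astype(float)
    h, w = I.shape
    best, best_score = None, -np.inf
    for x, y in window_pts:
        x, y = int(round(x)), int(round(y))
        if R <= x < w - R and R <= y < h - R:  # keep the disc in-bounds
            patch = I[y - R:y + R + 1, x - R:x + R + 1]
            score = float((disc * patch).sum())
            if score > best_score:
                best, best_score = (x, y), score
    return best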

[0110] Once the initial center of the circle, $[x, y]$, is found, a sub-image at $[x, y]$ with size $R$ is considered, and the Canny edge detector and then the circle Hough transform are applied to find the center and radius of the golf ball (i.e., $[x_{ball}, y_{ball}]$ and $Rad_{ball}$). The golf ball may have a non-circular appearance due to illumination conditions. Here, several methods are presented to make ball detection and size estimation more robust.
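One way to realize this refinement with OpenCV is sketched below; note that cv2.HoughCircles applies a Canny edge detector internally (param1 is its high threshold). The window is assumed to lie inside the image, and all parameter values are illustrative.

import numpy as np
import cv2

def refine_ball(I, x, y, R):
    """Refine the ball center and radius with the circle Hough
    transform on a sub-image around the initial estimate [x, y]."""
    sub = np.clip(I[y - R:y + R + 1, x - R:x + R + 1],
                  0, 255).astype(np.uint8)
    circles = cv2.HoughCircles(sub, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=R, param1=150, param2=15,
                               minRadius=max(R // 4, 1), maxRadius=R)
    if circles is None:
        return None
    cx, cy, rad_ball = circles[0, 0]
    # Convert from sub-image coordinates back to image coordinates.
    return x - R + cx, y - R + cy, rad_ball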

[0111] FIGS. 12A and 12B show symmetry of intensity 1202 (FIG. 12A) and symmetry of gradient direction 1204 (FIG. 12B) in images of golf balls. The first method uses symmetry of the ball to exclude false positives. This symmetry holds for intensity and gradient direction, as shown in FIGS. 12A and 12B. If the detected object does not have symmetric intensity and gradient direction, it is rejected. In order to find symmetry, the center of the ball is needed.

[0112] FIGS. 13A and 13B depict using a distance transform to find the center of the ball. The distance transform can be used to find the center of the circle: it robustly finds the maximum inscribed circle and gives a more reliable estimate of the ball center. FIG. 13A shows an example in which the Hough transform gives a smaller circle 1302 due to a shadow problem, but the distance transform 1304 can be used to give a better estimate of the ball center.
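A brief sketch of the distance-transform approach, assuming sub is the grayscale sub-image around the ball; the mask threshold is an illustrative assumption.

import numpy as np
import cv2

def ball_center_by_distance_transform(sub, thresh=128):
    """Distance transform of the ball mask: each mask pixel gets its
    distance to the nearest background pixel, so the peak marks the
    center of the maximum inscribed circle and the peak value
    estimates the radius, robustly to shadowed edges."""
    mask = (sub > thresh).astype(np.uint8)     # illustrative threshold
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    cy, cx = np.unravel_index(np.argmax(dist), dist.shape)
    return (cx, cy), float(dist[cy, cx])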

[0113] FIG. 14 depicts fitting a semicircle 1402 to the ball. The next method is to fit a semicircle instead of a full circle. Shadow usually makes the circle look smaller than its correct size. By fitting a semicircle, as shown in FIG. 14, the effect of shadows can be overcome, and a better estimate of the right size of the ball can be made. To fit a semicircle, the system could consider only large gradients, which typically occur in non-shadow regions, or use color consistency for high intensity values.

[0114] FIGS. 15A and 15B depict intensity profiles of lines through images of golf balls. In FIG. 15A, an intensity profile 1502 of a line 1504 not passing through the center of the ball is shown. In FIG. 15B, an intensity profile 1506 of a line 1508 passing through the center of the ball is shown. The next method determines the size of the ball and its center from the intensity profile of a line through a circle. To use such intensity profiles, the intensity profile of each of multiple lines is determined, and the line with the largest width is selected; in this embodiment, the line with the largest width passes through the center. That width is the diameter of the ball, and the mid-point of the intensity profile is the center of the circle, i.e., the center of the ball.
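A simple sketch of this profile scan, assuming a bright ball on a darker background in the sub-image sub and horizontal scan lines; the names and the contiguity assumption are illustrative.

import numpy as np

def ball_from_profiles(sub, T):
    """Scan horizontal intensity profiles of the sub-image; the row
    whose above-threshold span is widest passes through the center,
    the span is the diameter, and its mid-point is the center."""
    best_row, best = None, (0, 0, 0)  # (width, start, end)
    for r, row in enumerate(sub):
        above = np.flatnonzero(row > T)
        if above.size:
            width = int(above[-1] - above[0])
            if width > best[0]:
                best, best_row = (width, int(above[0]), int(above[-1])), r
    if best_row is None:
        return None
    diameter, s, e = best
    return ((s + e) / 2.0, float(best_row)), diameter / 2.0  # center, radius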

[0115] Temporal alignment is described next. Since the IMU system and the camera system run on different clocks, a method is needed to align the timing of the two systems, i.e., temporal alignment. In this disclosure a method to automatically synchronize the timing of the IMU system and the camera system is described. After synchronization, the IMU system captured signal (i.e., the IMU captured golf club swing trajectory) and the video (i.e., the two-dimensional video of the golf club swing) are temporally aligned and can be further aligned spatially.

[0116] With reference to FIGS. 1 and 16-20, a method and apparatus to synchronize (i.e., temporally align) the IMU system captured golf swing trajectory signal with the 2D video captured by a static camera are disclosed. The method has the following steps:

[0117] (1) Detect the golf ball location from a 2D video frame.

[0118] (2) Detect an impact frame by searching backward from the last frame of the video for the first frame in which the ball appears, i.e., the brightness at the ball location becomes large enough. In order to remove false detections due to environmental noise such as lighting changes, background object(s), etc., the result is further verified by using a temporal pattern.

[0119] (3) Calculate the time bias by aligning the detected impact frame from the video and the impact time from the IMU system. The time bias is the time difference between the first frame of the video and the first frame of the IMU system signal.

[0120] (4) Refine the time bias by minimizing the spatial alignment error with the temporally shifted IMU system golf club swing signal.

[0121] A method of detecting impact time (i.e., the moment at which the golf club head impacts the golf ball) is as follows. The method compares the image intensity change of two neighboring frames (the (i-1)th and ith frames) of a video of a golf club swing, at a determined ball location relative to each of the frames. If the change is larger than a threshold, the impact time is determined as the (i-1)th frame. This is further described below with reference to FIGS. 17 and 18.

[0122] Using intensity change alone to detect impact time usually suffers from noise from environmental factors such as lighting change and background object occlusion, resulting in false detections. Different from, and improving upon, the method above, a further method disclosed herein uses a temporal pattern to verify the initial detection result from image intensity, so that detection accuracy is improved. In addition, a method to synchronize the IMU system signal and video by using the detected impact frame is disclosed herein.

[0123] FIG. 16 is a flow diagram of a method for determining a time bias between a video frame and an inertial measurement unit sample, for synchronizing. One goal of the present system and method embodiments is to synchronize the IMU system signal and the video signal, i.e., determine the time bias between the first video frame and the first IMU system sample. Let N be the number of frames in the video. In an action 1602, the background image is estimated from the last few images. In an action 1604, ball detection is performed and the ball location is obtained, relative to coordinates of the frame. In an action 1606, a parameter i (e.g., a counting parameter) is set equal to N, the number of frames in the video. In an action 1608, the background is subtracted from frame i. In a decision action 1610, it is determined whether the intensity at a pixel at the previously determined ball location but in the background subtracted frame i is greater than a threshold. If no, in the action 1618, the parameter i is decremented. Flow proceeds to action 1608, for the next frame i. If yes, in the decision action 1610, flow proceeds to the decision action 1612, to determine whether a temporal pattern is satisfied. If no, the parameter i is decremented in the action 1618 and flow proceeds back to the action 1608 for the next frame i. If yes, in the decision action 1612, the initial time bias is estimated in the action 1614, using the RGB video 1620 and the IMU system trajectory with timestamp 1622. Then, the time bias is refined in the action 1616, using the IMU system trajectory with timestamp 1622 and camera intrinsics 1624.

[0124] The system takes RGB video, the IMU system trajectory with time stamp information, and camera intrinsics as input. The system first detects a golf ball location from 2D video frame(s). Then, proceeding backwards from the last frame, where the ball is assumed to be outside of the view, the system checks the intensity value at the detected ball center in each frame. If that intensity value is larger than a threshold and is further verified by a temporal pattern, the current frame is determined to be the impact frame. The initial time bias can then be calculated from the known IMU system impact time stamp and the video frame rate. Due to the nature of the method, the accuracy of the time bias is at best within one video frame. A refinement process follows to further improve the accuracy, by shifting the IMU system trajectory samples by Δt around the calculated time bias and finding the optimal Δt with the smallest spatial alignment error.

[0125] Image processing methods described above for detecting a ball-shaped object can be applied to detecting the golf ball in the 2D images of the video. In some embodiments, instead of detecting the ball appearance directly on a video frame, such detection is performed by the system on background-subtracted images to reduce background noise. Let $V_i$ denote the $i$th frame and $I_i$ denote the background-subtracted frames, defined as

$$I_i = |V_i - BG|, \qquad I_i(x,y) = \frac{I_i(x,y) - \min(I_i(x,y))}{\max(I_i(x,y)) - \min(I_i(x,y))} \cdot 255,$$

where $|\cdot|$ is the absolute value and $BG$ is the average of the last $M$ (e.g., $M = 10$) frames in which we are sure the ball does not appear:

$$BG = \frac{1}{M} \sum_{i=N-M+1}^{N} V_i,$$

where N is the total number of frames.
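A compact sketch of this background estimation and normalization, assuming frames is the full sequence of grayscale video frames; the epsilon guard against a constant frame is an added assumption.

import numpy as np

def background_subtract(frames, M=10):
    """BG is the mean of the last M frames (ball assumed out of view);
    each I_i = |V_i - BG| is then stretched to [0, 255] as above."""
    V = np.asarray(frames, dtype=np.float32)   # shape (N, H, W)
    BG = V[-M:].mean(axis=0)
    I = np.abs(V - BG)
    lo = I.min(axis=(1, 2), keepdims=True)
    hi = I.max(axis=(1, 2), keepdims=True)
    I = (I - lo) / np.maximum(hi - lo, 1e-6) * 255.0
    return I, BG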

[0126] Starting from the last frame of the video of the golf club swing, the system detects when there is a sudden intensity change at the location of the ball, i.e., it checks each frame at the ball location $[u, v]$ to determine if the corresponding value is larger than a threshold. A sudden intensity change at the ball location does not always indicate the impact time. In some cases, noise or other objects may appear at the location of the ball, and if the resulting change in intensity is greater than the threshold, it may cause a false detection. An example of such a scenario is presented in FIG. 17.

[0127] Therefore, in some embodiments, a verification step follows after such an intensity change frame is found. The following pseudo code illustrates the process.

for i = N : -1 : 1
    if (I_i(u, v) > T)
        if VerificationUsingTemporalPattern(I_n(u, v), n = i-K, ..., i+K) == true
            id_impactFrame = i
            exit for loop
        end
    end
end

A threshold $T$ can be defined as the average of the maximum and minimum of the intensity value at the ball location: $T = \tfrac{1}{2}\big(\max_i I_i(u, v) + \min_i I_i(u, v)\big)$, for $i = 1, \ldots, N$.

[0128] FIGS. 17A-17C show an example of false impact frame 1704 detection. Here, the golf club appears in the same location as the golf ball after the swing. When the algorithm starts from the end of the video, Frame 102 of FIG. 17A is reached before Frame 68 of FIG. 17B, and the change in intensity 1702 at the location of the ball in Frame 102 is higher than the threshold because of the golf club, as illustrated in FIG. 17C. Without verification, this can cause the algorithm to find the wrong impact frame.

[0129] FIG. 18 depicts a temporal pattern of intensity 1802 at the ball location, in video frames. In the verification step, the temporal pattern is used based on the observation that the intensity value before the impact frame should be high (the ball exists) and the intensity value after impact should be low (the ball disappears). FIG. 18 illustrates a binary temporal pattern template with a window size of A+B frames, i.e., $P_{tmp} = [1, 1, \ldots, 1, 0, \ldots, 0]$. Once an impact frame candidate is found, the system checks the corresponding intensity value for A consecutive frames before and B frames after, and calculates its temporal pattern $P_i$, defined as

$$P_i = [b_n,\ n = i-A : i+B]$$

$$b_n = \begin{cases} 1, & I_n(u, v) > T \\ 0, & I_n(u, v) \le T \end{cases}$$

[0130] Thus the impact frame candidate is determined to be a true impact frame if $D(P_i, P_{tmp}) < T_P$, where $D(\cdot)$ is a distance metric and $T_P$ is a pre-defined threshold. In one example, $D(\cdot)$ can be defined as the Hamming distance.
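Putting the threshold, the backward search, and the temporal-pattern verification together, a possible sketch follows; the window bounds and default values of A, B, and T_P are illustrative assumptions.

import numpy as np

def find_impact_frame(I, u, v, A=5, B=5, T_P=2):
    """Backward search for the impact frame with temporal-pattern
    verification: frames i-A+1..i should be bright at the ball
    location (ball present) and frames i+1..i+B dark (ball gone)."""
    vals = I[:, v, u]                      # intensity at ball location
    T = 0.5 * (vals.max() + vals.min())    # threshold from the text
    tmpl = np.array([1] * A + [0] * B)
    N = len(vals)
    for i in range(N - 1, -1, -1):
        if vals[i] > T and A - 1 <= i <= N - B - 1:
            P_i = (vals[i - A + 1:i + B + 1] > T).astype(int)
            if np.count_nonzero(P_i != tmpl) < T_P:  # Hamming distance
                return i
    return None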

[0131] The system estimates the initial time bias as follows. Let $f_v$ and $f_m$ denote the frame rates of the video and the IMU system signal, and let $i_v$ and $i_m$ denote the impact frame detected from the video and the impact sample detected from the IMU system. Thus the time bias is $t_{diff} = i_v/f_v - i_m/f_m$.
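For instance, a minimal worked computation of the initial bias (all numbers illustrative):

# Impact at video frame i_v = 68 of a 30 fps video, and at IMU
# sample i_m = 230 of a 100 Hz IMU stream.
f_v, f_m = 30.0, 100.0
i_v, i_m = 68, 230
t_diff = i_v / f_v - i_m / f_m   # 2.267 s - 2.300 s = -0.033 s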

[0132] However, in various embodiments of the system, the time bias should be refined. Due to the finite frame rate of the video, the highest accuracy of temporal synchronization based on the above approach is within ±1 video frame. In some embodiments, the sampling rate of the IMU system is much higher than that of the camera (e.g., video $f_v = 30$ frames per second, IMU $f_m = 100$ samples per second). Thus, a refinement step is performed to further improve the accuracy of the time bias to within ±1 IMU system sample.

[0133] With $t_{diff}$ as the initial temporal alignment, spatial alignment is performed as described with reference to FIGS. 1-15. Then, the IMU system signal is shifted by Δt (i.e., one or multiple IMU system sample intervals) forward and backward, as shown in FIG. 19.

[0134] FIG. 19 depicts temporal alignment by changing the time offset of the inertial measurement unit samples or frames relative to video frames. Three alignments are shown, with the IMU system samples 1902, 1904, 1906 aligned to the video frame 1908 at the moment of ball impact with offset 0 1902 (initial time bias), offset +1 1904, and offset -1 1906.

[0135] For each adjustment Δt, the corresponding spatial alignment error is calculated, and the temporal alignment with the smallest spatial alignment error is selected:

$$T_{diff} = t_{diff} + \Delta t_m$$

$$\Delta t_m = \arg\min_{\Delta t_i} Err\big(v,\ MT(t - t_{diff} - \Delta t_i)\big)$$

[0136] where $Err(\cdot)$ denotes the spatial alignment error measurement, and $v$ and $MT$ denote the trajectories obtained from the video and the IMU system, respectively.
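A sketch of this refinement loop, where spatial_error is an assumed callback that runs the spatial alignment of FIGS. 1-15 for a candidate bias and returns its residual; the step count and names are illustrative.

import numpy as np

def refine_time_bias(t_diff, imu_times, spatial_error, n_steps=3):
    """Shift the IMU trajectory by whole sample intervals around the
    initial bias and keep the shift whose spatial alignment error
    is smallest."""
    dt = float(np.median(np.diff(imu_times)))  # IMU sample interval
    shifts = np.arange(-n_steps, n_steps + 1) * dt
    errors = [spatial_error(t_diff + s) for s in shifts]
    delta_t_m = shifts[int(np.argmin(errors))]
    return t_diff + delta_t_m                  # T_diff = t_diff + Δt_m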

[0137] FIG. 20 is a block diagram of a golf coaching system in accordance with the present disclosure. An inertial measurement unit 2001, such as the M-Tracer™ in some embodiments, is attached to a golf club 2003 and is used for capturing inertial measurement data of a golf club swing by a golfer. A camera 2005 of a mobile device 2007 captures a two-dimensional video of the golf club swing. The mobile device 2007 is equipped with an inertial measurement unit 2009 and a display 2011.

[0138] A computing device 2013 has a processor 2017, memory 2019, and a wireless module 2015. The computing device receives the captured inertial measurement data of the golf club swing from the inertial measurement unit 2001 attached to the golf club 2003, via the wireless module 2015 of the computing device 2013. Also, the computing device receives the captured video from the camera 2005 of the mobile device 2007, via the wireless module 2015 of the computing device 2013.

[0139] The computing device has a z axis alignment module 2021, a translation module 2023, an x y axis alignment module 2025, a user assisted mode module 2027, a temporal alignment module 2029, and an overlay module 2031, each of which could be implemented in software executing on the processor 2017, hardware, firmware, or a combination thereof. These modules implement functions described above with reference to FIGS. 1-19. The computing device 2013 performs spatial and temporal alignment of the golf club inertial measurement data and the two-dimensional video of the golf club swing, and overlays a projected golf club swing trajectory onto the video. Then, the computing device 2013 sends the overlaid video, via the wireless module 2015 of the computing device 2013, to the mobile device 2007, which shows the overlaid video on the display 2011 of the mobile device 2007.

[0140] In a further embodiment, the computing device 2013 is integrated with the mobile device 2007, for example when the mobile device 2007 is a tablet that has a camera 2005, an inertial measurement unit 2009, and a display 2011. Many tablets, as is known, also have one or more processors and memory, a wireless module, user input devices, etc. In a still further embodiment, the camera 2005 is separate, and a display 2011 is coupled to or integrated with the computing device 2013. Video could be delivered from the camera 2005 to the computing device 2013 via a wired connection, or by removal of memory from the camera 2005 and insertion of the memory into the computing device 2013, e.g., using a memory stick. In a yet further embodiment, the computing device 2013 is combined with the inertial measurement unit 2001 attached to the golf club 2003, so that the system needs only a mobile device 2007 with a camera 2005, or a separate camera 2005, and the inertial measurement unit with the combined computing device 2013.

[0141] FIG. 21 is a flow diagram of a method for spatial alignment of golf club inertial measurement data and two-dimensional video for golf club swing analysis. The method can be practiced by one or more devices of a golf coaching system, in various embodiments as described above with reference to FIGS. 1-20 and variations thereof, and by one or more processors. Specifically, the method can be practiced by an inertial measurement unit attached to a golf club, a device that has a camera, and a computing device, which could be separate from or integrated with the inertial measurement unit and/or the device that has the camera. In various embodiments, the method is practiced by and with the M-Tracer™, a computing device and/or a mobile computing device such as a tablet or a smart phone.

[0142] In an action 2101, a golf club swing is captured, using an inertial measurement unit attached to a golf club. In an action 2103, a video of the golf club swing is captured, using a camera. The camera could be part of a smart phone or a tablet or other mobile device, or a camera attached to a computing device, or a separate camera. These actions 2101, 2103 should be performed at the same time, so that the video captures the same golf club swing as is captured by the inertial measurement unit. The inertial measurement unit sends the captured golf club swing to a computing device, in various embodiments. The device with the camera sends the captured video of the golf club swing to a computing device, in various embodiments, or acts as the computing device in further embodiments.

[0143] In an action 2105, the z axis for the inertial measurement unit captured golf club swing and the captured video of the golf club swing are aligned. In an action 2107, translation of the inertial measurement unit captured golf club swing to the video of the golf club swing is determined. In an action 2109, x and y axes of the inertial measurement unit captured golf club swing and the captured video of the golf club swing are aligned. These actions 2105, 2107, 2109 can be performed by a computing device, e.g., by the processor of a computing device.

[0144] In an action 2111, a projected golf club swing trajectory is overlaid onto the video. In some embodiments, this uses time bias for temporal alignment. This action 2111 can be performed by the computing device. In an action 2113, the overlaid video is displayed. This action 2113 can be performed by the computing device, using a display attached to the computing device. In some embodiments, the computing device could be part of a mobile device, so that the mobile device is performing the actions 2105, 2107, 2109, 2111, 2113. Or, in some embodiments, the computing device sends the overlaid video to a mobile device, and the mobile device displays the overlaid video on a screen of the mobile device, e.g., on the screen of a tablet or a smart phone.

[0145] It should be appreciated that the methods described herein may be performed with a digital processing system, such as a conventional, general-purpose computer system. Special purpose computers, which are designed or programmed to perform only one function, may be used in the alternative. FIG. 22 is an illustration showing an exemplary computing device which may implement the embodiments described herein. The computing device of FIG. 22 may be used to perform embodiments of the functionality for temporal and spatial alignment in accordance with some embodiments. The computing device includes a central processing unit (CPU) 2201, which is coupled through a bus 2205 to a memory 2203, and mass storage device 2207. Mass storage device 2207 represents a persistent data storage device such as a floppy disc drive or a fixed disc drive, which may be local or remote in some embodiments. The mass storage device 2207 could implement a backup storage, in some embodiments. Memory 2203 may include read only memory, random access memory, etc. Applications resident on the computing device may be stored on or accessed via a computer readable medium such as memory 2203 or mass storage device 2207 in some embodiments. Applications may also be in the form of modulated electronic signals accessed via a network modem or other network interface of the computing device. It should be appreciated that CPU 2201 may be embodied in a general-purpose processor, a special purpose processor, or a specially programmed logic device in some embodiments.

[0146] Display 2211 is in communication with CPU 2201, memory 2203, and mass storage device 2207, through bus 2205. Display 2211 is configured to display any visualization tools or reports associated with the system described herein. Input/output device 2209 is coupled to bus 2205 in order to communicate information and command selections to CPU 2201. It should be appreciated that data to and from external devices may be communicated through the input/output device 2209. CPU 2201 can be defined to execute the functionality described herein to enable the functionality described with reference to FIGS. 1-21. The code embodying this functionality may be stored within memory 2203 or mass storage device 2207 for execution by a processor such as CPU 2201 in some embodiments. The operating system on the computing device may be MS-WINDOWS™, OS/2™, UNIX™, LINUX™, iOS™, or other known operating systems. It should be appreciated that the embodiments described herein may also be integrated with a virtualized computing system that is implemented with physical computing resources.

[0147] Detailed illustrative embodiments are disclosed herein. However, specific functional details disclosed herein are merely representative for purposes of describing embodiments. Embodiments may, however, be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.

[0148] It should be understood that although the terms first, second, etc. may be used herein to describe various steps or calculations, these steps or calculations should not be limited by these terms. These terms are only used to distinguish one step or calculation from another. For example, a first calculation could be termed a second calculation, and, similarly, a second step could be termed a first step, without departing from the scope of this disclosure. As used herein, the term "and/or" and the "/" symbol includes any and all combinations of one or more of the associated listed items.

[0149] As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises", "comprising", "includes", and/or "including", when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Therefore, the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting.

[0150] It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may in fact be executed substantially concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.

[0151] With the above embodiments in mind, it should be understood that the embodiments might employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. Further, the manipulations performed are often referred to in terms, such as producing, identifying, determining, or comparing. Any of the operations described herein that form part of the embodiments are useful machine operations. The embodiments also relate to a device or an apparatus for performing these operations. The apparatus can be specially constructed for the required purpose, or the apparatus can be a general-purpose computer selectively activated or configured by a computer program stored in the computer. In particular, various general-purpose machines can be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.

[0152] A module, an application, a layer, an agent or other method-operable entity could be implemented as hardware, firmware, or a processor executing software, or combinations thereof. It should be appreciated that, where a software-based embodiment is disclosed herein, the software can be embodied in a physical machine such as a controller. For example, a controller could include a first module and a second module. A controller could be configured to perform various actions, e.g., of a method, an application, a layer or an agent.

[0153] The embodiments can also be embodied as computer readable code on a tangible non-transitory computer readable medium. The computer readable medium is any data storage device that can store data, which can be thereafter read by a computer system. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices. The computer readable medium can also be distributed over a network coupled computer system so that the computer readable code is stored and executed in a distributed fashion. Embodiments described herein may be practiced with various computer system configurations including hand-held devices, tablets, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. The embodiments can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wire-based or wireless network.

[0154] Although the method operations were described in a specific order, it should be understood that other operations may be performed in between described operations, described operations may be adjusted so that they occur at slightly different times or the described operations may be distributed in a system which allows the occurrence of the processing operations at various intervals associated with the processing.

[0155] In various embodiments, one or more portions of the methods and mechanisms described herein may form part of a cloud-computing environment. In such embodiments, resources may be provided over the Internet as services according to one or more various models. Such models may include Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). In IaaS, computer infrastructure is delivered as a service. In such a case, the computing equipment is generally owned and operated by the service provider. In the PaaS model, software tools and underlying equipment used by developers to develop software solutions may be provided as a service and hosted by the service provider. SaaS typically includes a service provider licensing software as a service on demand. The service provider may host the software, or may deploy the software to a customer for a given period of time. Numerous combinations of the above models are possible and are contemplated.

[0156] Various units, circuits, or other components may be described or claimed as "configured to" perform a task or tasks. In such contexts, the phrase "configured to" is used to connote structure by indicating that the units/circuits/components include structure (e.g., circuitry) that performs the task or tasks during operation. As such, the unit/circuit/component can be said to be configured to perform the task even when the specified unit/circuit/component is not currently operational (e.g., is not on). The units/circuits/components used with the "configured to" language include hardware--for example, circuits, memory storing program instructions executable to implement the operation, etc. Reciting that a unit/circuit/component is "configured to" perform one or more tasks is expressly intended not to invoke 35 U.S.C. 112, sixth paragraph, for that unit/circuit/component. Additionally, "configured to" can include generic structure (e.g., generic circuitry) that is manipulated by software and/or firmware (e.g., an FPGA or a general-purpose processor executing software) to operate in a manner that is capable of performing the task(s) at issue. "Configured to" may also include adapting a manufacturing process (e.g., a semiconductor fabrication facility) to fabricate devices (e.g., integrated circuits) that are adapted to implement or perform one or more tasks.

[0157] The foregoing description, for the purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the embodiments and its practical applications, to thereby enable others skilled in the art to best utilize the embodiments and various modifications as may be suited to the particular use contemplated. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

* * * * *

