Let's assume you have two stepper motors, one per axis. You know how many steps each motor makes per revolution (N_x, N_y), and (hopefully) you also know the linear distance equivalent to one rotation for each axis (L_x, L_y).
There's no magic, this is basic plane geometry.
If your initial point is (x_0,y_0), for a number of steps on each motor (n_x, n_y) your new position is: (x_new, y_new) where
x_new = x_0 + n_x*(L_x / N_x)
and
y_new = y_0 + n_y*(L_y / N_y)
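A minimal sketch of that position update, with made-up values for the motor constants (200 steps per revolution, 8 cm of travel per revolution are assumptions, not from the thread):

```python
# Hypothetical constants: 200-step motors, 8 cm of linear travel per revolution.
N_x, N_y = 200, 200      # steps per revolution, per axis
L_x, L_y = 8.0, 8.0      # linear distance (cm) per revolution, per axis

def new_position(x0, y0, n_x, n_y):
    """Advance an (x, y) position by n_x and n_y motor steps."""
    x_new = x0 + n_x * (L_x / N_x)
    y_new = y0 + n_y * (L_y / N_y)
    return x_new, y_new

print(new_position(0.0, 0.0, 100, 50))  # → (4.0, 2.0)
```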
The angle, theta, between your old and new points, using the +x direction as the zero heading and measuring in a counter-clockwise sense, is

theta = arctan( (n_y*(L_y / N_y)) / (n_x*(L_x / N_x)) )
You may want to investigate more sophisticated heading measurement to avoid pesky singularities and ambiguities.
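One standard way to avoid those singularities (n_x = 0) and quadrant ambiguities is the two-argument arctangent. A sketch, reusing the same hypothetical motor constants as above:

```python
import math

# Hypothetical constants: 200-step motors, 8 cm of travel per revolution.
N_x = N_y = 200
L_x = L_y = 8.0

def heading(n_x, n_y):
    """Heading (radians, CCW from +x) of the displacement from n_x, n_y steps."""
    dx = n_x * (L_x / N_x)
    dy = n_y * (L_y / N_y)
    # atan2 handles dx == 0 and resolves the quadrant,
    # unlike a plain arctan of dy/dx.
    return math.atan2(dy, dx)

print(math.degrees(heading(100, 100)))  # → 45.0
print(math.degrees(heading(0, 50)))     # → 90.0 (plain dy/dx would divide by zero)
```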
I am doing the same thing, but I use the stepper motor pulse count to find the angle instead of revolutions. It gives me the x,y coordinates correctly, but only in the forward direction; sometimes the angle it gives is right, and sometimes it is wrong.
Can you suggest the best way to find the angle and coordinates?
I confess, I am not at all clear now what your system looks like. Nils is perfectly right - wheel-slip, encoder errors, and so on all conspire to make odometry the worst (?) method for deducing position.
Do you have any other options (GPS, etc.?) And how accurate do you need the position to be?
Do you have a link to the system's description?
(Is this a vehicle with driven wheels? Or an X/Y plotter bed? Or something else? And if your stepper motors provide angles (!) then in the LabVIEW something must be reading the absolute angle (gray codes?) or doing some counting.)
I am using a stepper motor whose step angle is 3.75°, so 96 pulses (360°/3.75°) rotate it one full revolution.
I count the pulses with a controller and transmit the measured data over ZigBee; another ZigBee module receives the data, and the whole calculation is done in LabVIEW.
Using the same pulse count I calculate the distance travelled by my small robot, and that reading is perfect.
But at the same time it gives me a wrong angle, so can you suggest how to calculate the angle?
And may I presume two wheels? One on the left, one on the right, of the robot?
So, when driving the wheels with an equal number of pulses, you'll get identical values for the distance travelled by each wheel.
So that matches your description: "using the same pulse count I calculate the distance travelled by my small robot, and that reading is perfect."
What you want, I think, is to know the heading vector at any time (or the angular displacement from an origin) when both wheels have been driven with an arbitrary number of pulses.
This is quite hard.
You will have to track, at a finely divided timescale, the number of pulses that each wheel has logged (or, conversely, the angle through which each wheel has turned).
If you are driving each wheel with different pulse rates, then your craft will execute curved paths. You will need to find the partial arcs that describe the trajectory of the robot between manoeuvres and then add them vectorially.
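That arc-based pose update can be sketched as follows. This is the standard constant-arc odometry model for a two-wheeled differential drive; the wheel radius and track width below are hypothetical, and only the 3.75°/step (96 steps per revolution) figure comes from the thread:

```python
import math

# 3.75°/step motors → 96 steps per revolution (from the thread).
# Wheel radius and track width are assumed values for illustration.
STEPS_PER_REV = 96
WHEEL_RADIUS = 3.0    # cm (assumed)
TRACK_WIDTH = 10.0    # cm between the two wheels (assumed)
DIST_PER_STEP = 2 * math.pi * WHEEL_RADIUS / STEPS_PER_REV

def update_pose(x, y, theta, pulses_left, pulses_right):
    """Advance the pose (x, y, theta) over one short interval,
    assuming each wheel ran at a constant rate (constant-arc model)."""
    d_l = pulses_left * DIST_PER_STEP
    d_r = pulses_right * DIST_PER_STEP
    d = (d_l + d_r) / 2.0                  # distance moved by the robot centre
    d_theta = (d_r - d_l) / TRACK_WIDTH    # heading change (radians)
    if abs(d_theta) < 1e-9:                # straight-line segment
        return x + d * math.cos(theta), y + d * math.sin(theta), theta
    R = d / d_theta                        # radius of the arc
    x_new = x + R * (math.sin(theta + d_theta) - math.sin(theta))
    y_new = y - R * (math.cos(theta + d_theta) - math.cos(theta))
    return x_new, y_new, theta + d_theta

# Equal pulse counts: the robot drives straight, heading unchanged.
print(update_pose(0.0, 0.0, 0.0, 48, 48))
```

The important point is the one made above: this must be applied per short time slice, accumulating the small arcs, not once over the whole journey, because the model assumes the pulse rates were constant within each slice.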
This is not simple.
If you are driving the wheels such that you either have rotations *or* translations, then it's much easier.
But I suspect that ResearchGate isn't the best medium for this - would you not be able to find a willing soul at the University that you attend?
Odometry and dead reckoning are certainly important for estimating the XY position from the angle measurements of the wheels: they are fast and they will work in almost any condition (environmental issues, disturbances, etc.) as long as the wheels are turning. However, they DO NOT detect wheel slip and ARE NOT very accurate in the task space.
So, to improve the XY positioning of mobile robots, a global measurement unit is usually introduced. Depending on the workspace and working environment of your device, you can find many applications in the literature (GPS, Kinect, optical sensors, cameras, etc.). To integrate these new sensors with the odometry, the most common technique is Kalman-filter-based sensor fusion. If you search for this technique, I am sure you will find many articles, publications, and applications.
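To make the fusion idea concrete, here is a minimal 1-D Kalman filter sketch that blends an odometry increment (prediction) with an absolute position measurement from some external sensor (correction). All noise values are made up for illustration:

```python
def kalman_step(x, P, odom_dx, z, Q=0.05, R=0.5):
    """One predict/update cycle of a scalar Kalman filter.
    x, P    : current position estimate and its variance
    odom_dx : displacement reported by odometry since the last step
    z       : absolute position from the external sensor
    Q, R    : process and measurement noise variances (assumed values)
    """
    # Predict: move by the odometry increment; uncertainty grows.
    x_pred = x + odom_dx
    P_pred = P + Q
    # Update: blend in the external measurement via the Kalman gain.
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0
x, P = kalman_step(x, P, odom_dx=1.0, z=1.2)
print(x, P)  # estimate pulled toward the measurement, variance reduced
```

The same predict/correct structure extends to the full (x, y, theta) pose; the real work is modelling the noise of each sensor.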
It is not possible to use GPS or a Kinect, because my mobile robot is only 12 cm x 12 cm x 10 cm (length x width x height) and it has to pass through closed, bounded areas of about 25 cm x 25 cm (width x height) or a 30 cm round opening.
In this setup, I count the stepper motor pulses, transmit the measured pulse count over ZigBee to the user end (10 m away), and do the whole calculation in LabVIEW.
Last but not least, since you are working in a bounded area, you could perhaps use laser pointers or similar devices to estimate your position from how far your device is from the specified boundaries.
Your calculations for the stepper motor sound right, just not accurate enough on their own. You need to verify them against a global measurement.
Conference Paper Robust Mobile Robot Localization Using Optical Flow Sensors ...
Article Experimental Evaluation of a Fiber Optics Gyroscope for Impr...
Hey Raj, I am doing a similar project to find the new X and Y coordinates using LabVIEW and a laser tracker. Would it be possible for me to get some help from you?
Well, I'm trying a new approach: with the laser tracker I can find the position and orientation of the robot, but my problem is that I'm new to LabVIEW and the coding looks very difficult.