EDGE EXTRACTION USING IMAGE AND THREE-AXIS TACTILE DATA

International Journal on Smart Sensing and Intelligent Systems

Editor-in-Chief: Professor Subhas Chandra Mukhopadhyay

Publisher: Exeley Inc. (New York)

Subject: Computational Science & Engineering, Engineering, Electrical & Electronic

eISSN: 1178-5608

DESCRIPTION

1
Reader(s)
1
Visit(s)
0
Comment(s)
0
Share(s)

Sukarnur Che Abdullah * / Takuya Ikai / Yusuke Dosho / Hanafiah Bin Yussof / Masahiro Ohka

Citation Information: International Journal on Smart Sensing and Intelligent Systems. Volume 4, Issue 3, Pages 508-526, DOI: https://doi.org/10.21307/ijssis-2017-454

License: (CC BY-NC-ND 4.0)

Received: 12-June-2011 / Accepted: 22-August-2011 / Published Online: 01-September-2011

ABSTRACT

This paper describes a hand-arm system equipped with optical three-axis tactile sensors and a binocular vision sensor. Vision sensing compensates for the limitations of tactile sensing, and vice versa. The tactile sensor obtains geometric data at real scale, whereas image data require calibration to yield lengths in metric units, and even with stereovision sufficient precision cannot be obtained. In the evaluation test, the robotic hand equipped with tactile sensors traces an object containing convex and concave portions to evaluate edge-tracing precision. The distance error of the binocular vision is around ±10 mm when the distance between the camera and the object is around 600 mm. When the hand-arm robot touches the convex portion of the object, the size data obtained by vision are corrected to within ±0.5 mm. Because the robotic finger is too thick to reach the bottom of the concave portion, the size data for that portion obtained by tactile sensing include a relatively large error of around 4 mm; nevertheless, the finger can follow the contour with ±0.5 mm accuracy except at the bottom. Vision sensing alone is therefore not sufficient for precise edge exploration, and correction based on tactile sensing is required.
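
The correction step suggested by the abstract, in which a coarse vision-derived edge position is modified once the far more precise tactile sensor makes contact, can be illustrated with a minimal Python sketch. This is not taken from the paper: the variance-weighted fusion, the function name, and the example coordinates are assumptions chosen only to mirror the reported ±10 mm vision error and ±0.5 mm tactile accuracy.

import numpy as np

# Assumed measurement uncertainties, taken from the figures quoted in the abstract.
VISION_SIGMA_MM = 10.0   # stereo-vision error at ~600 mm camera-object distance
TACTILE_SIGMA_MM = 0.5   # three-axis tactile contact accuracy

def fuse_edge_point(vision_mm, tactile_mm):
    """Variance-weighted average of a vision-derived edge point and a tactile
    contact point (3-D positions in millimetres). With these variances the
    fused result is dominated by the tactile measurement, so the coarse
    vision estimate is effectively corrected by touch."""
    w_v = 1.0 / VISION_SIGMA_MM ** 2
    w_t = 1.0 / TACTILE_SIGMA_MM ** 2
    return (w_v * np.asarray(vision_mm) + w_t * np.asarray(tactile_mm)) / (w_v + w_t)

# Hypothetical example: the finger touches a convex edge whose position
# vision had misjudged by roughly 8 mm.
vision_edge = np.array([102.0, 0.0, 598.0])
tactile_edge = np.array([110.3, 0.0, 600.1])
print(fuse_edge_point(vision_edge, tactile_edge))  # result lies very close to the tactile point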
