== LCAD Professor Achieves 7th Place at the Amazon Picking Challenge ==
  
[[Arquivo:Photo-icra.jpg|480px|centro]]

Professor Alberto F. De Souza, coordinator of the '''High-Performance Computing Laboratory (LCAD)''', was part of a team that achieved 7th place at the Amazon Picking Challenge, an autonomous robotics competition organized by Amazon ([https://dl.acm.org/doi/abs/10.1609/aimag.v37i2.2659]). As part of his postdoctoral activities with Prof. Kostas Bekris's group (https://pracsys.cs.rutgers.edu/) at Rutgers University (USA), Professor Alberto developed hardware and software for the Motoman robot (SDA10F, Yaskawa - https://www.motoman.com/en-us/products/robots/industrial/assembly-handling/sda-series/sda10f) using various technologies previously developed at LCAD. This result places LCAD, PPGI, and UFES among the top 10 teams in autonomous robotic manipulation worldwide.

Bloomberg article about the Amazon Picking Challenge featuring a video interview with Prof. Alberto.
[https://www.bloomberg.com/news/videos/2015-05-28/amazon-seeks-robots-with-human-grasp-from-students]

Video showing the final version of the system developed for the competition (5x speed for better visualization):

[[Image:13.jpg|thumb|600px|centro|nenhum|link=https://youtu.be/8bXFiNWPc3E]]

Robot only:

[[Image:14.jpg|thumb|600px|centro|nenhum|link=https://youtu.be/PhxOCKRejGM]]

This video presents the initial system proposal for the robot, designed at the beginning of the project. The final system architecture used in the competition is a slightly improved version.

[[Image:1.jpg|thumb|600px|centro|nenhum|link=http://youtu.be/M7ktnz32BA0]]

This video presents the preliminary mapping, localization, and object detection systems.

[[Image:2.jpg|thumb|600px|centro|nenhum|link=https://youtu.be/47_Bj3o9s-Y]]

The object detector used is an improved version of the Linemod detector. This video shows a test performed after some improvements were made to the publicly available version of Linemod. The final version developed and used in the competition will be released as open-source software.

[[Image:3.jpg|thumb|600px|centro|nenhum|link=https://youtu.be/s_mtzXzKT7c]]
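The improved detector itself is not reproduced on this page. For readers unfamiliar with Linemod, the sketch below illustrates only the general idea behind this family of detectors: a template is scored against the scene by comparing quantized image-gradient orientations. It is a simplified NumPy illustration under that assumption, not the code used in the competition, which additionally spreads orientations for shift tolerance and matches only a sparse set of strong-gradient features.

<syntaxhighlight lang="python">
import numpy as np

def quantized_orientations(gray, n_bins=8):
    """Quantize image-gradient orientations into n_bins discrete directions (Linemod-style)."""
    gy, gx = np.gradient(gray.astype(float))
    angle = np.mod(np.arctan2(gy, gx), np.pi)               # gradient direction, sign ignored
    return (angle / np.pi * n_bins).astype(int) % n_bins    # bin index in [0, n_bins)

def template_similarity(template_bins, patch_bins, n_bins=8):
    """Score a template against a same-sized patch; 1.0 means all orientations agree."""
    diff = np.abs(template_bins - patch_bins)
    diff = np.minimum(diff, n_bins - diff)                   # circular distance between bins
    return float(np.mean(np.cos(diff * np.pi / n_bins)))     # 1.0 = identical, 0.0 = orthogonal
</syntaxhighlight>

Sliding this score over the scene and keeping the high-scoring locations gives the basic detection idea behind Linemod-type matching.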
This video shows the Baxter robot performing all the main competition tasks. Rutgers already had this robot, which was used in the preliminary phase of the project. Professor Alberto was responsible for the decision-making, mapping, localization, and object detection subsystems, while Prof. Bekris's team was responsible for the robot motion planning (their specialty). However, the motion planning system shown in this video was also developed by Prof. Alberto, based on the open-source MoveIt! software. Object detection was still handled by a preliminary version of the system later used in the competition.

[[Image:4.jpg|thumb|600px|centro|nenhum|link=http://youtu.be/0K_PHgdauGo]]
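Since this preliminary motion-planning layer was built on the open-source MoveIt! framework, a minimal example of commanding a Cartesian goal through the moveit_commander Python interface is sketched below. The planning group name ("left_arm") and the target pose are placeholder values for illustration, not the configuration actually used with Baxter or the Motoman.

<syntaxhighlight lang="python">
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

# Plan and execute a motion to a Cartesian pose with MoveIt!.
# The group name and target pose below are illustrative placeholders.
moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("apc_pick_sketch")

arm = moveit_commander.MoveGroupCommander("left_arm")

target = Pose()
target.position.x = 0.7      # metres, in the robot base frame
target.position.y = 0.2
target.position.z = 1.1      # roughly shelf-bin height
target.orientation.w = 1.0   # identity orientation

arm.set_pose_target(target)
arm.go(wait=True)            # plan and execute in one call
arm.stop()                   # make sure no residual motion remains
arm.clear_pose_targets()

moveit_commander.roscpp_shutdown()
</syntaxhighlight>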
Video submitted to Amazon in a parallel competition to determine who would receive a shelf and the objects to be used in the competition. In it, the motion planner developed by Prof. Bekris's group can be seen running in simulation.

[[Image:5.jpg|thumb|600px|centro|nenhum|link=https://youtu.be/eFRg8EhO2Uk]]

Video submitted to Yaskawa, the manufacturer of the Motoman SDA10F robot used in the competition. Thanks to this video, Rutgers was loaned, for six months, a robot worth approximately US$150,000.00.

[[Image:6.jpg|thumb|600px|centro|nenhum|link=https://youtu.be/-Lz8SbGRlmw]]

Video submitted to Amazon in a parallel competition for support to transport the robot to the competition (Rutgers won the support).

[[Image:7.jpg|thumb|600px|centro|nenhum|link=https://youtu.be/H33EeTi3FaI]]

More advanced version of the system running on Baxter and picking objects from the competition shelf.

[[Image:8.jpg|thumb|600px|centro|nenhum|link=http://youtu.be/kx6mFj9pjPU]]

This vacuum gripper was designed by Professor Alberto, in collaboration with the company Unigripper (http://www.unigripper.com), specifically for the competition.

[[Arquivo:Photo_1.JPG|thumb|480px|centro|Hand-drawn sketch of the vacuum gripper design developed for the competition]]

All points scored in the competition were obtained with it. Its basic physical structure was built by Unigripper, but the electronic actuation, as well as the software driver that controls it, were implemented by Prof. Alberto. Rutgers won Unigripper's support in a competition that the company organized in parallel with the Amazon Picking Challenge.

[[Image:9.jpg|thumb|600px|centro|nenhum|link=https://www.youtube.com/watch?v=4hWTLhXAjyA]]
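The actuation electronics are not documented on this page, so the sketch below is only an illustration of how small such a driver can be: it switches a suction valve through a relay controlled over a serial link. The port name, baud rate, and one-byte on/off protocol are assumptions made for the example, not the actual interface built for the competition.

<syntaxhighlight lang="python">
import serial  # pyserial

class VacuumGripper:
    """Toggle a suction valve via a serial-controlled relay (illustrative protocol only)."""

    def __init__(self, port="/dev/ttyUSB0", baudrate=9600):
        # Port and baud rate are assumed values for this sketch.
        self.link = serial.Serial(port, baudrate, timeout=1.0)

    def suction_on(self):
        self.link.write(b"1")   # close the relay: pump/valve engages, object is held

    def suction_off(self):
        self.link.write(b"0")   # open the relay: vacuum released, object drops

    def close(self):
        self.link.close()
</syntaxhighlight>

In a complete system, a driver of this kind would typically be wrapped as a ROS node so that suction can be commanded from the task-level decision-making subsystem.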
In this video, Prof. Alberto presents the complete system he developed (including motion planning) running with the mapping, localization, and object detection components in the real world, but with the competition robot simulated.

[[Image:10.jpg|thumb|600px|centro|nenhum|link=https://youtu.be/4__ainsD0QI]]

This video presents the final version of the system running on the competition robot. In it, the Motoman SDA10F successfully performs one of the competition tasks.

[[Image:11.jpg|thumb|600px|centro|nenhum|link=https://www.youtube.com/watch?v=tofuV3d3XW0]]

This video again shows the final version of the system running on the competition robot, this time with the Motoman SDA10F successfully performing several competition tasks:

[[Image:12.jpg|thumb|600px|centro|nenhum|link=https://www.youtube.com/watch?v=ag6aSwCUzMA]]

The same video at 5x speed:

[[Image:13.jpg|thumb|600px|centro|nenhum|link=https://youtu.be/8bXFiNWPc3E]]

Robot only, 5x:

[[Image:14.jpg|thumb|600px|centro|nenhum|link=https://youtu.be/PhxOCKRejGM]]
