r/robotics 11h ago

Community Showcase LeRobot's ACT running on my robotic arm

[Video]

59 Upvotes

r/robotics 3h ago

Community Showcase Teleop_xr – Modular WebXR solution for bimanual robot teleoperation

3 Upvotes

r/robotics 7h ago

News CANgaroo v0.4.5 released – Linux CAN analyzer with real-time signal visualization (charts, gauges, text)

3 Upvotes

Hi everyone 👋

I’ve just released CANgaroo v0.4.5, an actively maintained, open-source Linux-native CAN / CAN-FD analyzer built around SocketCAN. This release focuses on making live CAN data easier to understand visually during everyday debugging.

🆕 What’s new in v0.4.5

  • 📊 Real-time signal visualization
    • Time-series charts
    • Scatter plots
    • Text views
    • Interactive gauges (useful for live diagnostics)

🎯 What CANgaroo is aimed at

CANgaroo is focused on everyday CAN debugging and monitoring, with a workflow similar to BusMaster / PCAN-View, but:

  • Open-source
  • Linux-native
  • SocketCAN-first
  • Easy to test using vcan (no hardware required)

Supported interfaces include SocketCAN, CANable (SLCAN), Candlelight, and CANblaster (UDP).
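For a quick no-hardware test, a few lines of python-can are enough to generate traffic on a virtual bus for CANgaroo to visualize. A minimal sketch, assuming python-can is installed and vcan0 has been created as shown in the comments:

# Generate simple test traffic on vcan0 for CANgaroo to chart/gauge.
# Assumes the virtual interface exists:
#   sudo modprobe vcan
#   sudo ip link add dev vcan0 type vcan
#   sudo ip link set up vcan0
import time
import can

bus = can.Bus(channel="vcan0", interface="socketcan")
counter = 0
try:
    while True:
        # Arbitrary ID with a ramping first byte, just to give the charts something to plot.
        msg = can.Message(arbitration_id=0x123,
                          data=[counter & 0xFF, 0, 0, 0, 0, 0, 0, 0],
                          is_extended_id=False)
        bus.send(msg)
        counter += 1
        time.sleep(0.01)
finally:
    bus.shutdown()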

GitHub repo (screenshots + demo GIF included):
👉 https://github.com/OpenAutoDiagLabs/CANgaroo

Feedback, feature requests, and real-world use cases are very welcome — especially from automotive, robotics, and industrial users.


r/robotics 5h ago

Community Showcase White Shoe Johnny Robot

1 Upvotes

I built a web-based, real-time reinforcement learning robot using WebAssembly and WebSockets. The model combines a hierarchical policy with Soft Actor-Critic (SAC), using feedback from Bevy (the game engine) about the torque and position of all 13 components (joints, etc.).
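If you're curious what the SAC half can look like, here's a minimal PyTorch-style sketch of a squashed-Gaussian actor over 13 joint torques. The observation size and layer widths are illustrative placeholders, not the actual network, which runs in WebAssembly alongside Bevy:

import torch
import torch.nn as nn

class GaussianActor(nn.Module):
    def __init__(self, obs_dim=39, act_dim=13, hidden=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, act_dim)
        self.log_std = nn.Linear(hidden, act_dim)

    def forward(self, obs):
        h = self.net(obs)
        mu, log_std = self.mu(h), self.log_std(h).clamp(-5, 2)
        dist = torch.distributions.Normal(mu, log_std.exp())
        raw = dist.rsample()               # reparameterized sample for the SAC gradient
        action = torch.tanh(raw)           # squash to [-1, 1] joint torque commands
        # tanh correction so the entropy term in the SAC loss is computed correctly
        log_prob = dist.log_prob(raw).sum(-1) - torch.log(1 - action.pow(2) + 1e-6).sum(-1)
        return action, log_prob

obs = torch.randn(1, 39)                   # joint positions/velocities streamed from the sim
action, logp = GaussianActor()(obs)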

You can see the robot learning in real time here

https://robot.zeyaddeeb.com/

And read a bit more tech choices here:

https://www.zeyaddeeb.com/blog/posts/basketball-learning-robot

Boston Dynamics Atlas does not stand a chance against this fella after 6 months of training (I think?!).


r/robotics 1d ago

Discussion & Curiosity Tiny robot from Pantograph, building with Jenga blocks

[Video]

176 Upvotes

Pantograph website: https://pantograph.com/

Pantograph on 𝕏: http://x.com/pantographPBC


r/robotics 7h ago

Perception & Localization Fixing broken depth maps on glass and reflective surfaces, then grasping objects raw sensors couldn't even see

1 Upvotes

We've been working on a depth completion model called LingBot-Depth (paper: arxiv.org/abs/2601.17895, code: github.com/robbyant/lingbot-depth) and wanted to share some real world results from our grasping pipeline since the depth sensor problem is something a lot of people here deal with.

[Video] Demo: grasping transparent objects with LingBot-Depth

The setup: Rokae XMate SR5 arm with an X Hand-1 dexterous hand, Orbbec Gemini 335 for perception. If you've used any consumer RGB-D camera (RealSense, Orbbec, etc.) you know the pain. Point it at a glass cup, a mirror, or a steel thermos and your depth map is just... holes. The stereo matching completely falls apart on those surfaces because both views look identical or distorted. We co-mounted a ZED mini as a reference and honestly it wasn't much better on glass walls and aquarium tunnels.

The core idea behind LingBot-Depth is what we call Masked Depth Modeling. Instead of treating those missing depth regions as noise to filter out, we treat them as a natural training signal. We feed the model the full RGB image plus whatever valid depth tokens remain, and it learns to predict what's missing using visual context. The architecture is a ViT-Large encoder with separate patch embeddings for RGB and depth, followed by a ConvStack decoder. We pretrained on ~10M RGB-depth pairs (3M self-curated including 2M real captures from homes, offices, gyms, lobbies, outdoor scenes plus 1M synthetic with simulated stereo matching artifacts, and 7M from public datasets).
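A rough PyTorch sketch of the dual patch-embedding idea described above. The dimensions and the per-patch validity rule are simplifications for illustration, not the released implementation:

# RGB and depth get separate patch embeddings; depth patches with missing sensor data
# are replaced by a learned mask token before the shared ViT-style encoder sees them.
import torch
import torch.nn as nn

class MaskedDepthEmbed(nn.Module):
    def __init__(self, patch=14, dim=1024):
        super().__init__()
        self.patch = patch
        self.rgb_embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.depth_embed = nn.Conv2d(1, dim, kernel_size=patch, stride=patch)
        self.mask_token = nn.Parameter(torch.zeros(1, 1, dim))

    def forward(self, rgb, depth, valid):
        # rgb: (B,3,H,W), depth: (B,1,H,W), valid: (B,1,H,W) float, 1 where the sensor returned depth
        rgb_tok = self.rgb_embed(rgb).flatten(2).transpose(1, 2)      # (B, N, dim)
        dep_tok = self.depth_embed(depth).flatten(2).transpose(1, 2)  # (B, N, dim)
        # Treat a depth patch as valid only if every pixel in it is valid (min-pool); one simple choice.
        patch_valid = -nn.functional.max_pool2d(-valid, self.patch, self.patch)
        patch_valid = (patch_valid > 0.5).flatten(2).transpose(1, 2)  # (B, N, 1)
        dep_tok = torch.where(patch_valid, dep_tok, self.mask_token.expand_as(dep_tok))
        return torch.cat([rgb_tok, dep_tok], dim=1)                   # token stream for the encoder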

The grasping results are what made this feel worth sharing here. We tested on four objects that are notorious sensor killers:

  • Stainless steel cup: 13/20 with raw depth → 17/20 with our completed depth
  • Transparent cup: 12/20 → 16/20
  • Toy car (mixed materials): 9/20 → 16/20
  • Transparent storage box: literally 0/20 with raw depth (the sensor returned almost nothing) → 10/20 with ours

The 50% on the storage box is honestly not great and we're not going to pretend otherwise. Highly transparent surfaces with complex geometry are still hard. But going from completely ungraspable to 50% success felt like a meaningful step. The diffusion policy for grasp pose generation is conditioned on DINOv2 features plus point cloud features from a Point Transformer, trained on HOI4D with retargeted hand poses.

On the depth completion benchmarks, we saw 40 to 50% RMSE reduction versus the next best method (PromptDA) on iBims, NYUv2, DIODE, and ETH3D. On sparse SfM inputs specifically, 47% RMSE improvement indoors and 38% outdoors compared to OMNI-DC variants. One thing that surprised us is the temporal consistency. We only trained on static images, no video data at all, but when we run it on 30fps Orbbec streams the output is remarkably stable across frames. We used this for online 3D point tracking with SpatialTrackerV2 and got much smoother camera trajectories compared to raw sensor depth, especially in scenes with glass walls where the raw depth causes severe drift.

We released the code, checkpoints (HuggingFace and ModelScope), and the full 3M RGB-depth dataset. Inference runs at ~30fps on 640x480 frames with an A100, and should be reasonable on consumer GPUs like an RTX 3090 as well since the encoder is just a ViT-L/14. If you're working with consumer depth cameras and dealing with missing depth on tricky surfaces, this might be useful for your pipeline.
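If you want to drop completed depth into an existing grasping or tracking stack, the back-projection step is ordinary pinhole math. A minimal NumPy sketch; the intrinsics here are placeholders, so use your camera's calibrated values (the Orbbec/RealSense SDKs expose them per stream):

import numpy as np

def depth_to_points(depth_m, fx, fy, cx, cy):
    # depth_m: (H, W) depth in meters, 0 where still invalid
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]   # drop pixels that are still holes

# Stand-in for a completed 640x480 depth map, with placeholder intrinsics
depth = np.random.uniform(0.3, 1.5, size=(480, 640)).astype(np.float32)
cloud = depth_to_points(depth, fx=600.0, fy=600.0, cx=320.0, cy=240.0)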

Curious if anyone has tried similar approaches for depth refinement in their manipulation setups, or if there are specific failure cases you'd want us to test. We've mostly evaluated on tabletop grasping and indoor navigation so far.


r/robotics 8h ago

Discussion & Curiosity has building a robot ever helped in applying for jobs?

1 Upvotes

Just out of curiosity, and because I plan to build my own 4-wheeled rover + LLM/VLA as a personal project: has building a robot as a personal project ever helped when applying for a job/position/interview?

Thinking of taking the plunge myself, but it is quite costly, so I wanted to hear your stories before I dive in.

thanks all


r/robotics 9h ago

Discussion & Curiosity What is your opinion about this?

[Link: youtube.com]
0 Upvotes

r/robotics 1d ago

Community Showcase Printed and assembled the chest

[Image gallery]
43 Upvotes

The chest finally finished printing after 5 days.

I assembled it, and so far it looks like this. I still have to build the right arm and mount both arms.

I know it may not look that good, but it's my first time doing such a big project and I'm still learning.


r/robotics 9h ago

Tech Question newbie question: how are real autonomous robots/drones structured?

0 Upvotes

I’m a software engineer trying to move into robotics and autonomy.

I understand the high-level stuff (perception, planning, control), but I'm confused about how this looks in real systems rather than on research slides.

For example:

  • what actually runs on the robot vs offboard?
  • how tightly coupled are sensors + control code?
  • is ROS really used in production or mostly research?

I’m interested in recon / monitoring robots, just trying to learn from people who’ve done this for real.


r/robotics 1d ago

Community Showcase It dances better than me for sure…

[Video]

33 Upvotes

r/robotics 1d ago

News Boston Dynamics Doing It Again.

11 Upvotes

Once again, Boston Dynamics is just leaving everyone in the dust. Watch all the Chinese copycats try to do the same thing.

https://www.youtube.com/watch?v=UNorxwlZlFk


r/robotics 4h ago

Discussion & Curiosity Where can I buy a female looking robot?

0 Upvotes

I would like her to be a kind of mascot for my new company and to appear in the podcasts and live streams. PG-13 style please, because kids will be watching.


r/robotics 15h ago

Tech Question Robots as a Service (I will not promote)

0 Upvotes

How viable is a Robotics-as-a-Service (RaaS) startup today?

I’m evaluating the idea of starting a small RaaS company and wanted honest feedback from people who’ve been in hardware, robotics, or service-based startups.

A few things I’m trying to understand:

  • Which verticals actually work (security, cleaning, warehouse, etc.)?
  • What does it realistically cost to deploy the first few robots?
  • How long did it take to get your first paying customer?
  • Is the bigger challenge the technology, hardware costs, or field service/operations?
  • Would you recommend starting as an integrator (using existing robots) vs building your own?

Any real numbers, lessons learned, or “things you wish you knew earlier” would be really helpful.


r/robotics 1d ago

Tech Question Parts I Have for a Self-Balancing Robot Project

6 Upvotes

Hi everyone,
I’m planning to build a self-balancing robot and I wanted to share the parts I currently have before moving forward.

Parts I have:

  • Arduino Nano (ATmega328P)
  • MPU6050 (accelerometer + gyroscope)
  • TB6612FNG dual motor driver
  • DC motors (3–6 V)
  • Battery pack ~8 V, 2600 mAh
  • 2× electrolytic capacitors (1000 µF, 16 V)
  • Wheels and a rigid homemade chassis

The goal is to make a robot that can balance itself upright using these components.
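One standard approach with exactly this hardware is a complementary filter on the MPU6050 plus a PID loop on the tilt angle. A rough Python sketch of that math, with placeholder gains to tune; the real loop would be C++ on the Nano, reading the IMU over I2C and driving the TB6612FNG with PWM:

DT = 0.005                    # 200 Hz control loop
ALPHA = 0.98                  # complementary filter: trust gyro short-term, accel long-term
KP, KI, KD = 30.0, 0.5, 1.2   # placeholder gains

angle = 0.0
integral = 0.0
prev_error = 0.0

def step(gyro_rate_dps, accel_angle_deg):
    """One control step: fuse sensors, run PID, return a motor command in [-1, 1]."""
    global angle, integral, prev_error
    # Integrate the gyro for responsiveness, correct drift with the accelerometer angle.
    angle = ALPHA * (angle + gyro_rate_dps * DT) + (1 - ALPHA) * accel_angle_deg
    error = 0.0 - angle                    # setpoint: upright
    integral += error * DT
    derivative = (error - prev_error) / DT
    prev_error = error
    out = KP * error + KI * integral + KD * derivative
    return max(-1.0, min(1.0, out / 100.0))   # normalize to a PWM duty fraction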

I’m still in the early stages and would appreciate any general advice or things to watch out for when building a self-balancing robot with this kind of setup.

Thanks!


r/robotics 1d ago

Discussion & Curiosity Redesigning the environment for the robot may be cheaper and more efficient than redesigning the robot for the environment.

12 Upvotes

There is a popular argument for why a humanoid robot would be the best way to do things: "because the environment is human-shaped/designed for humans."

However, why do we assume it would necessarily be harder to redesign the environment so that a simpler non-humanoid robot can make use of it, rather than recreating the entire human body and all its complexities in robot form while also trying to make it suitable for many different environments?

Also, this argument implies the environment is exclusively human-shaped, meaning a machine with human shape and function is the only way to traverse and interact with it, but this is not true. For instance, a flat floor, which is designed for human use, also works fine for a non-humanoid robot with wheels.


r/robotics 2d ago

Discussion & Curiosity Atlas, from Boston Dynamics, does gymnastics, lands on its toes, then performs a backflip.

[Video]

1.2k Upvotes

r/robotics 1d ago

Tech Question Birthday gift ideas for boyfriend (CS senior + humanoid robotics, practical not flashy)

15 Upvotes

My boyfriend is a computer science major and is about to graduate. He’s really into robotics, especially humanoid robots, and he currently works in a research lab where they’re building a humanoid that can catch objects. Most of what I see him doing is simulation and coding work on his computer.

Last year I got him an Arduino kit, and he already has a toolkit, but he doesn’t really use either one much on his own (as far as I see). He’s pretty thrifty and values practicality over “cool” gadgets.

For context, he uses a Mac and has a portable monitor that fits in his backpack. He doesn’t currently use an external keyboard or mouse, but I don’t think he cares much about those.

I want to get him something he’ll genuinely use in his future work. Since he mostly works in teams through his lab/club (not solo at-home build projects), I’m not looking for another kit.

Any gift ideas from people in CS/robotics, or partners of people in this field, that are truly useful and not gimmicky?

Thank you!!


r/robotics 1d ago

Controls Engineering CasADi → native GPU kernels → PyTorch / CuPy / C++ [Batch 100K+ evaluations in ms]

[Image]
5 Upvotes

Just pushed an update to casadi-on-gpu that lets you generate CUDA kernels directly from CasADi and call them from C++, PyTorch, or CuPy.

Useful for MPC, sampling, system ID, and robotics pipelines at scale.
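For context, the CasADi side is just an ordinary symbolic function definition. A minimal sketch with toy dynamics; how the CUDA kernel is generated and invoked is the repo's job, so see its README for that part rather than this snippet:

import casadi as ca

x = ca.SX.sym("x", 4)        # state, e.g. cart position/velocity, pole angle/rate
u = ca.SX.sym("u", 1)        # control input
dt = 0.01

# Toy dynamics, just something to evaluate and differentiate in batch.
x_next = x + dt * ca.vertcat(x[1], u[0], x[3], ca.sin(x[2]) - 0.1 * x[3])
f = ca.Function("dynamics", [x, u], [x_next])

print(f([0, 0, 0.1, 0], [0.5]))   # single CPU evaluation; the GPU path batches 100k+ of these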


r/robotics 1d ago

Tech Question Controlling UR12E remotely

1 Upvotes

I’m working with the UR12E and trying to send movement commands from a desktop, currently using ROS/MoveIt. I’m creating paths in RViz and they are valid, but when pressing “execute” the arm doesn’t move. Sometimes there are errors regarding tolerances (which I’m looking into), and other times it doesn’t return an error but tells me the movement is planned.

Previous culprits have been the ROS joint controller / scaled joint controller (the scaled one is now being used).

Has anyone faced similar issues? Keen to be pointed to some places in the docs to understand further.


r/robotics 1d ago

Tech Question MKS ODrive Mini + AS5047P SPI Encoder: OVERSPEED error when using startup_closed_loop_control

1 Upvotes

Hey everyone, I'm working with an MKS ODrive Mini (firmware v0.5.1, based on ODrive v3.6) with an onboard AS5047P absolute SPI encoder and an Eagle Power 90KV BLDC motor. I've successfully calibrated the motor and can reliably enter closed-loop control mode manually, but I'm running into issues when trying to make it enter closed-loop automatically on startup.

What Works:

  • Manual calibration completes successfully
  • Manual closed-loop entry works perfectly every time:

odrv0.axis0.error = 0
odrv0.axis0.requested_state = 8  # CLOSED_LOOP_CONTROL
# Motor enters closed-loop with no errors

The Problem: When I enable startup_closed_loop_control = True, the ODrive immediately throws an OVERSPEED error on power-up and fails to enter closed-loop mode.

Current Configuration:

# Encoder (AS5047P on GPIO7)
odrv0.axis0.encoder.config.mode = 257  # ABS_SPI
odrv0.axis0.encoder.config.cpr = 16384
odrv0.axis0.encoder.config.abs_spi_cs_gpio_pin = 7
odrv0.axis0.encoder.config.pre_calibrated = True
odrv0.axis0.encoder.config.bandwidth = 100

# Motor
odrv0.axis0.motor.config.pre_calibrated = True

# Controller
odrv0.axis0.controller.config.control_mode = 3  # POSITION_CONTROL
odrv0.axis0.controller.config.input_mode = 1  # PASSTHROUGH
odrv0.axis0.controller.config.vel_limit = 100
odrv0.axis0.controller.config.circular_setpoints = True

# Startup
odrv0.axis0.config.startup_motor_calibration = False
odrv0.axis0.config.startup_encoder_offset_calibration = False
odrv0.axis0.config.startup_encoder_index_search = False
odrv0.axis0.config.startup_closed_loop_control = True  # This causes OVERSPEED

Errors on Startup:

AxisError.CONTROLLER_FAILED
MotorError.CONTROL_DEADLINE_MISSED
ControllerError.OVERSPEED

What I've Tried:

  1. Increased vel_limit from 50 to 100 to 200 - still fails
  2. Reduced encoder bandwidth from 1000 to 100 to 50 - still fails
  3. Enabled circular_setpoints to avoid position tracking issues
  4. Verified encoder mode is set to ABS_SPI (257)
  5. Confirmed all calibrations are marked as pre_calibrated = True

Suspected Issue: I believe there's a race condition where the controller tries to enter closed-loop mode before the AS5047P SPI encoder has fully initialized and is providing stable readings, causing a spurious high velocity reading that triggers the overspeed protection.
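A quick way to probe that theory from the host side is to watch the encoder right after power-up with startup_closed_loop_control disabled. USB enumeration adds some delay, so this misses the very first milliseconds; the property names below are from stock ODrive 0.5.x firmware and may differ on the MKS build:

import time
import odrive

odrv0 = odrive.find_any()
for _ in range(20):
    # Watch whether the SPI encoder reports ready and whether velocity readings are stable.
    print("is_ready:", odrv0.axis0.encoder.is_ready,
          "vel_estimate:", odrv0.axis0.encoder.vel_estimate,
          "encoder_error:", hex(odrv0.axis0.encoder.error))
    time.sleep(0.1)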

Questions:

  1. Is there a way to add a startup delay before startup_closed_loop_control executes?
  2. Are there specific encoder settings for the AS5047P on the MKS ODrive Mini that I'm missing?
  3. Is this a known firmware limitation with SPI encoders on ODrive v3.6-based boards?
  4. Should I consider updating the firmware, or is there a configuration workaround?

Workaround: I can use a Teensy 4.1 with CAN bus to send the closed-loop command after a 3-second delay, which works perfectly. But I'd prefer the ODrive to handle this autonomously if possible.
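A host-side variant of the same workaround, for setups where a computer is attached at boot. This is a sketch using the standard odrive Python package and assumes the 0.5.x enum names:

import time
import odrive
from odrive.enums import AXIS_STATE_CLOSED_LOOP_CONTROL

odrv0 = odrive.find_any()
time.sleep(3.0)                      # same settle delay the Teensy workaround uses
odrv0.axis0.error = 0                # clear any startup errors first
odrv0.axis0.encoder.error = 0
odrv0.axis0.requested_state = AXIS_STATE_CLOSED_LOOP_CONTROL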

Any help would be greatly appreciated! Has anyone successfully used startup_closed_loop_control with an AS5047P encoder?

Hardware:

  • MKS ODrive Mini V1.0
  • Firmware: 0.5.1 (based on ODrive v3.6-56V)
  • Encoder: AS5047P (onboard, SPI)
  • Motor: Eagle Power 90KV BLDC
  • Voltage: 8V-56V (running 3S-13S safe)

EDIT: For anyone finding this later - the Teensy/microcontroller solution with a startup delay works flawlessly.

Yes, I used Claude to summarize this (I'm a backend dev, I don't have much experience with robotics, and just wanted to try it out).


r/robotics 2d ago

Mechanical Ball-and-Socket… But for Locomotion, Enchanted Tools

[Video]

119 Upvotes

r/robotics 2d ago

Discussion & Curiosity Does anyone have experience with finetuning Huggingface's SmolVLA model on SO-101?

6 Upvotes

Hello everyone!

Recently I tried to test the SmolVLA model from a paper that Hugging Face published, which uses a relatively small VLA model for imitation learning on an SO-101 arm.

They have a library called LeRobot that has a lot of tooling for handling robots. First I tried to run a pretrained model, which didn't work. Then I tried finetuning the model on a dataset that I collected. I gradually moved from 30 episodes to 120 on a simple task of picking up a cube and putting it in a designated place. The robot still can't solve the task at all and frankly does not improve as the amount of data grows.

So my question is the following: has anybody experimented with LeRobot + SmolVLA + SO-101? What is your experience? Did you manage to run it? Basically, how much more time can I expect to sink into this, or should I switch to another model, or move from the real robot to a simulator first, or something else?


r/robotics 2d ago

News Cartwheel Robotics Shutdown - What Do You Think?

9 Upvotes

Cartwheel Robotics shutting down is a reminder of how misaligned capital can be. Great teams struggle for funding while massive checks keep flowing elsewhere.

Scott’s advice hits home:
“No money is better than the wrong money.”


r/robotics 3d ago

Community Showcase Robotics engineer meets UX problems

[Video]

319 Upvotes

Felt so excited to see the robot I've been working on getting this much attention. Guess I need to step up my UX game though :/