Laser Harp Sets the Tone

In many ways, living here in the future is quite exciting. We have instant access to the world’s information, and we can have all manner of exciting tools and hardware delivered to our homes in ways that people in the past, armed with only a Sears catalog, could only dream of. Lasers are of course among that exciting hardware, and they can now be purchased at extremely high power levels. Provided the proper safety precautions are taken, that can lead to some interesting builds, like this laser harp which uses a 3W laser for its strings.

[Cybercraftics]’ musical instrument uses a single laser to generate all seven harp strings: a fast stepper motor rotates a mirror to precise positions, and persistence of vision turns the rapidly moving dot into what looks like a set of separate beams. Although he originally planned to use a single Arduino for this project, adding MIDI and the other musical parts disrupted the precise timing needed to keep the strings in place, so he split those duties out to a second Arduino.
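The core trick is simple enough to sketch in a few lines of Arduino code. This is a minimal illustration of the persistence-of-vision sweep, not [Cybercraftics]’ actual firmware; the pin assignments, the step counts for the seven mirror positions, and the timing values are all assumptions you would tune on real hardware.

```cpp
// Sweep a mirror through seven positions fast enough that the eye sees
// seven steady beams instead of one moving dot. Illustrative sketch only.
const int STEP_PIN = 2, DIR_PIN = 3, LASER_PIN = 4;      // assumed wiring
const int stringSteps[7] = {0, 10, 20, 30, 40, 50, 60};  // assumed positions
int currentPos = 0;

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  pinMode(LASER_PIN, OUTPUT);
}

// Step the mirror to an absolute position, one pulse at a time.
void moveTo(int target) {
  digitalWrite(DIR_PIN, target > currentPos);
  while (currentPos != target) {
    digitalWrite(STEP_PIN, HIGH);
    delayMicroseconds(300);            // pulse spacing sets sweep speed
    digitalWrite(STEP_PIN, LOW);
    delayMicroseconds(300);
    currentPos += (target > currentPos) ? 1 : -1;
  }
}

void loop() {
  for (int i = 0; i < 7; i++) {
    moveTo(stringSteps[i]);
    digitalWrite(LASER_PIN, HIGH);     // dwell briefly to draw one "string"
    delayMicroseconds(500);
    digitalWrite(LASER_PIN, LOW);
  }
  moveTo(stringSteps[0]);              // fly back for the next frame
}
```

A loop this tight is also why the MIDI duties had to move to a second board: any extra work inside loop() stretches the frame time and the strings visibly drift.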

Although his first prototype worked, he did have to experiment quite a bit with the sensors used to detect his hand position on the instrument before getting good results. This is where the higher-powered laser came into play, as the lower-powered ones weren’t quite bright enough. He also wears a pair of white gloves, which make it easier to see where a beam is being blocked. With most of the issues ironed out, [Cybercraftics] notes that there’s room for improvement, but he still has a working instrument that seems like a blast to play. If you’re still stuck in the past without easy access to lasers, though, it’s worth noting that there are plenty of other ways to build futuristic instruments as well.

Taming the Wobble: An Arduino Self-Balancing Bot

Getting a robot to stand on two wheels without tipping over involves a challenging dance with the laws of physics. Self-balancing robots are a great way to get into control systems, sensor fusion, and embedded programming. This build by [mircemk] shows how to make one with just a few common components, an Arduino, and a bit of patience fine-tuning the PID controller.

At the heart of the bot is the MPU6050 – a combo accelerometer/gyroscope sensor that keeps track of tilt and movement. An Arduino Uno takes this data, runs it through a PID loop, and commands an L298N motor driver to adjust the speed and direction of two DC motors. The power comes from two Li-ion batteries feeding everything with enough juice to keep it upright. The rest of the magic lies in the tuning.

PID (Proportional-Integral-Derivative) control is what keeps the robot balanced. Kp (proportional gain) determines how aggressively the motors respond to tilting, Kd (derivative gain) dampens oscillations, and Ki (integral gain) corrects slow drift. Set them wrong, and your bot either wobbles like a confused penguin or falls flat on its face. A good trick is to start with only Kp, then slowly add Kd and Ki until it stabilizes. And don’t forget to calibrate your MPU6050; each sensor has unique offsets that need to be compensated for in the code.
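To make that concrete, here is a minimal Arduino sketch of the whole chain described above: complementary-filtered MPU6050 angle in, PID in the middle, L298N motor commands out. It is not [mircemk]’s code; the pin assignments, gains, and filter constant are illustrative assumptions you would tune for your own bot.

```cpp
#include <Wire.h>

const int MPU = 0x68;                        // MPU6050 I2C address
const int ENA = 5, IN1 = 6, IN2 = 7;         // L298N left motor (assumed pins)
const int ENB = 10, IN3 = 8, IN4 = 9;        // L298N right motor (assumed pins)
const float Kp = 20.0, Ki = 0.5, Kd = 0.8;   // starting gains, tune Kp first

float angle = 0, integral = 0, lastError = 0;
unsigned long lastMicros;

int16_t read16() {                           // read one big-endian register pair
  int16_t hi = Wire.read();
  return (hi << 8) | Wire.read();
}

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU);
  Wire.write(0x6B);                          // PWR_MGMT_1: wake from sleep
  Wire.write(0);
  Wire.endTransmission();
  int pins[] = {ENA, IN1, IN2, ENB, IN3, IN4};
  for (int p : pins) pinMode(p, OUTPUT);
  lastMicros = micros();
}

void loop() {
  // Burst-read 14 bytes starting at ACCEL_XOUT_H (0x3B).
  Wire.beginTransmission(MPU);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU, 14);
  int16_t ax = read16(); read16();           // accel X; skip accel Y
  int16_t az = read16(); read16();           // accel Z; skip temperature
  read16();                                  // skip gyro X
  int16_t gy = read16(); read16();           // gyro Y; skip gyro Z

  float dt = (micros() - lastMicros) / 1e6;
  lastMicros = micros();

  // Complementary filter: trust the gyro short-term, the accel long-term.
  float accAngle = atan2((float)ax, (float)az) * 180.0 / PI;
  angle = 0.98 * (angle + (gy / 131.0) * dt) + 0.02 * accAngle;

  // PID: P reacts to tilt, I corrects slow drift, D damps the wobble.
  float error = 0.0 - angle;                 // setpoint is upright (0 degrees)
  integral = constrain(integral + error * dt, -50.0, 50.0);
  float output = Kp * error + Ki * integral + Kd * (error - lastError) / dt;
  lastError = error;

  // Drive both wheels toward the fall, harder the further the bot tilts.
  bool fwd = output > 0;
  int pwm = constrain((int)abs(output), 0, 255);
  digitalWrite(IN1, fwd);  digitalWrite(IN2, !fwd);
  digitalWrite(IN3, fwd);  digitalWrite(IN4, !fwd);
  analogWrite(ENA, pwm);   analogWrite(ENB, pwm);
}
```

The sensor-offset calibration mentioned above would show up here as constants subtracted from the raw readings before the filter.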

Once dialed in, the result is a robot that looks like it defies gravity. Whether you’re hacking it for fun, turning it into a segway-like ride, or using it as a learning tool, a balancing bot is a great way to sharpen your control system skills. For more inspiration, check out this earlier attempt from 2022, or these self-balancing robots (one with a little work) from a year before that. You can read up on [mircemk]’s project details here.

Physical Computing Used to be a Thing

In the early 2000s, the idea that you could write programs on microcontrollers that did things in the physical world, like run motors or light up LEDs, was kind of new. At the time, most people thought of coding as stuff that stayed on the screen, or in cyberspace. This idea of writing code for physical gadgets was uncommon enough that it had a buzzword of its own: “physical computing”.

You never hear much about “physical computing” these days, but that’s not because the concept went away. Rather, it’s probably because it’s almost become the norm. I realized this as Tom Nardi and I were talking on the podcast about a number of apparently different trends that all point in the same direction.

We started off talking about the early days of the Arduino revolution. Sure, folks had been building hobby projects around microcontrollers since before the Arduino, but the combination of a standardized board, a wide-ranging software library, and abundant examples to learn from brought embedded programming to a much wider audience. In particular, it brought this to an audience of beginners who were not only blinking an LED for the first time, but maybe even taking their first steps into coding. For many, the Arduino hello world was their coding hello world as well. These folks are “physical computing” natives.
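For the record, that shared hello world is the stock Blink sketch that still ships as the first example in the Arduino IDE:

```cpp
// Blink: the canonical Arduino (and "physical computing") hello world.
void setup() {
  pinMode(LED_BUILTIN, OUTPUT);       // on-board LED on most Arduino boards
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);    // LED on
  delay(1000);                        // wait a second
  digitalWrite(LED_BUILTIN, LOW);     // LED off
  delay(1000);
}
```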

Now, it’s to the point that when Arya goes to visit FOSDEM, an open-source software convention, there is hardware everywhere. Why? Because many successful software projects support open hardware, and many others run on it. People port their favorite programming languages to microcontroller platforms, and as those platforms become more powerful, the lines between the “big” computers and the “micro” ones start to blur.

And I think this is awesome. For one, it’s somehow more rewarding, when you’re just starting to learn to code, to see the letters you type cause something in the physical world to happen, even if it’s just blinking an LED. At the same time, everything has a microcontroller in it these days, and hacking on these devices is also another flavor of physical computing – there’s code in everything that you might think of as hardware. And with open licenses, everything being under version control, and more openness in open hardware than we’ve ever seen before, the open-source hardware world reflects the open-source software ethos.

Are we getting past the point where the hardware / software distinction is even worth making? And was “physical computing” just the buzzword for the final stages of blurring out those lines?

This article is part of the Hackaday.com newsletter, delivered every seven days for each of the last 200+ weeks. It also includes our favorite articles from the last seven days that you can see on the web version of the newsletter. Want this type of article to hit your inbox every Friday morning? You should sign up!

Customer feedback and contribution

Nick built his six-digit Nixie clock from the kit. The enclosure and inner works, which make it function and look like a clock, are his own original creation, all nicely documented here.

The Arduino code is based on the example sketch, but he added extra features that are controlled by an ESP8266 over Serial, using an interesting ATMega + ESP8266 combo board. Please check it out.
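As a rough sketch of how a link like that tends to work (this is an assumption about the design, not Nick’s actual code), the ESP8266 can send short newline-terminated text commands that the ATMega parses in its main loop. The command names and baud rate here are made up for illustration, and the commented-out helpers are hypothetical.

```cpp
// ATMega side of an assumed ESP8266-over-Serial control scheme.
void setup() {
  Serial.begin(9600);                   // UART link to the ESP8266
}

void loop() {
  static String line;
  while (Serial.available()) {
    char c = Serial.read();
    if (c == '\n') {                    // one complete command received
      if (line.startsWith("TIME ")) {
        // e.g. "TIME 21:45" -- update the Nixie digits (hypothetical helper)
        // setClock(line.substring(5));
      } else if (line.startsWith("BRI ")) {
        // e.g. "BRI 128" -- set tube brightness (hypothetical helper)
        // setBrightness(line.substring(4).toInt());
      }
      line = "";                        // ready for the next command
    } else {
      line += c;
    }
  }
}
```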

Exporting KiCad PCB w/ silkscreen to Fusion 360

Here’s the process I’ve been using to add a silkscreen image to a STEP model exported from KiCad into Fusion 360. The steps are split between KiCad PCB and Fusion 360.

Adobe is updating its terms of service following a backlash over recent changes

Following customer outrage over its latest terms of service (ToS), Adobe is making updates to add more detail around areas like AI and content ownership, the company said in a blog post. "Your content is yours and will never be used to train any generative AI tool," wrote head of product Scott Belsky and VP of legal and policy Dana Rao.

Subscribers using products like Photoshop, Premiere Pro and Lightroom were incensed by new, vague language they interpreted to mean that Adobe could freely use their work to train the company's generative AI models. In other words, creators thought that Adobe could use AI to effectively rip off their work and then resell it. 

Other language was thought to mean that the company could actually take ownership of users' copyrighted material (understandably so, given how the language read).

None of that was accurate, Adobe said, noting that the new terms of use were put in place for its product improvement program and content moderation for legal reasons, mostly around CSAM. However, many users didn't see it that way and Belsky admitted that the company "could have been clearer" with the updated ToS.

"In a world where customers are anxious about how their data is used, and how generative AI models are trained, it is the responsibility of companies that host customer data and content to declare their policies not just publicly, but in their legally binding Terms of Use," Belsky said. 

To that end, the company promised to overhaul the ToS using "more plain language and examples to help customers understand what [ToS clauses] mean and why we have them," it wrote.

Adobe didn't help its own cause by releasing an update on June 6th with some minor changes to the same vague language as the original ToS and no sign of an apology. That only seemed to fuel the fire more, with subscribers to its Creative Cloud service threatening to quit en masse. 

In addition, Adobe claims that it only trains its Firefly system on Adobe Stock images. However, multiple artists have noted that their names are used as search terms in Adobe's stock footage site, as Creative Bloq reported. The results yield AI-generated art that occasionally mimics the artists' styles. 

Its latest post is more of a true mea culpa with a detailed explanation of what it plans to change. Along with the AI and copyright areas, the company emphasized that users can opt out of its product improvement programs and that it will more "narrowly tailor" licenses to the activities required. It added that it only scans data on the cloud and never looks at locally stored content. Finally, Adobe said it will be listening to customer feedback around the new changes.

This article originally appeared on Engadget at https://www.engadget.com/adobe-is-updating-its-terms-of-service-following-a-backlash-over-recent-changes-120044152.html?src=rss

The Morning After: Everything Apple announced at WWDC

Apple’s annual developer shindig kicked off with its traditional keynote outlining all the new tricks its products will soon do. There are big changes for iOS 18, iPadOS 18, macOS Sequoia and watchOS 11, not to mention visionOS 2. Some highlights include a standalone Passwords app, better health metrics on the Watch and Apple Intelligence, its own spin on AI. There’s more to dig into, so keep reading for all the biggest stories from the show.

— Dan Cooper

The biggest stories you might have missed

Blackmagic is developing a camera for immersive Apple Vision Pro videos

Yes, iOS 18 will include RCS support

Apple’s new AI-powered Siri can use apps for you

Apple may integrate Google’s Gemini AI into iOS in the future

iOS 18 embraces Apple Intelligence, deeper customization and a more useful Siri

macOS Sequoia will let you see your iPhone mirrored on your Mac’s screen

iPadOS 18 is getting a big boost with Apple Intelligence

You can get these reports delivered daily direct to your inbox. Subscribe right here!

Apple’s first attempt at AI is Apple Intelligence

A for Apple… A for Artificial… I get it!

Apple has finally bowed to pressure, bringing AI to its devices in the form of Apple Intelligence, powered in part by a partnership with OpenAI. The system will bolster Siri, offering generative AI smarts to write emails, summarize news articles and offer finer-grained control of your apps. It’ll be interesting to see, given Apple’s long-held distaste for machine learning gimmicks, whether it can win where Google and Microsoft have floundered.

Continue Reading.

Apple brings a full-featured Passwords app to the Mac, iPhone, iPad and Windows

Let’s see how third-party password managers respond.

Apple already has a dedicated password manager buried in its operating systems, but now it’ll be its own app. Passwords will act as a standalone password manager across every Apple platform and will even work on Windows via iCloud. Like iCloud Keychain, it’ll generate and record passwords to all of your sites and services, locking them behind biometric security.

Continue Reading.

This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-everything-apple-announced-at-wwdc-111550649.html?src=rss

Future of 3D Printing: Researchers Demonstrate Tiny Chip-Based 3D Printer

Researchers from MIT and the University of Texas at Austin have demonstrated the first chip-based 3D printer, potentially revolutionizing the 3D printing landscape. At present it is a novel proof of concept: small enough to fit in the palm of a hand, the device features a single millimeter-scale photonic chip that directs reconfigurable beams of light into a resin, which cures into a solid object where the light strikes it. This new technique bypasses the bulky mechanical systems used in traditional 3D printers, making the technology more portable and accessible.

Jabra updated its Elite earbuds with an LE Audio case, improved ANC and more

Jabra's Elite 10 and Elite 8 Active earbuds debuted in August, but the company isn't waiting for an annual update to unveil a second-generation model for both of those sets. Neither of them will look different, except for some new color options, but there are big upgrades to both. The company has taken this opportunity to make changes to noise cancellation, audio features, spatial sound and other areas.

First, both the new Elite 10 and Elite 8 Active will come with what Jabra calls "the world's first LE Audio smart case." This will allow you to plug the charging case into any USB-C or 3.5mm jack to wirelessly transmit sound to the earbuds. While in-flight entertainment might be a primary use case here, other possibilities abound, including audio from workout equipment, TVs and more. The company says the revamped cases are equipped with a new chip to transmit LE Audio with lower latency than similar options already on the market. Jabra also promises better overall sound quality when using the feature, including "Hi-Fi" playback.

Next, Jabra says it also improved the active noise cancellation (ANC) performance on both the new Elite 10 and Elite 8 Active. The company is promising to block "up to twice as much noise" as the previous generations. To do so, Jabra explains that it fine-tuned the internal feedback microphones to provide better noise blocking for mid- and low-frequency sounds. What's more, ANC algorithms have been updated to better utilize their adaptive capabilities, so the earbuds should handle things like airplane noise and the roar of the gym more effectively. 

Jabra also did some fine-tuning to its HearThrough mode. The ambient sound feature on both the new Elite 10 and Elite 8 Active has been tweaked for enhanced sound outdoors with a dedicated Natural HearThrough mode. This new setting offers increased wind noise reduction that's twice as effective as that of the previous generation, according to the company. Algorithms expand the frequency range of the regular HearThrough mode to make this possible.

Jabra Elite 8 Active (2nd gen). (Image: Jabra)

While the Elite 8 Active had Dolby Audio and the Elite 10 offered Dolby Atmos with head tracking, Jabra says the second-generation models both offer improved tuning for spatial sound. The company explains that during testing, 95 percent of its "expert panel" preferred the new audio profile to that of the previous gen. Lastly, Jabra is promising improved call quality on both the new Elite 10 and Elite 8 Active thanks to updated noise-reduction algorithms that provide enhanced voice recognition in subpar environments.

All of the other stats on both sets of earbuds are holdovers from the previous generation. That includes the IP68 rating on the Elite 8 Active (case is IP54) and the IP57 rating on the Elite 10 (no case rating). You can also still expect up to six hours of battery life with ANC on for the Elite 10 (27 hours total with the case) and up to eight hours of noise-cancelling use on the Elite 8 Active (32 hours total with the case). Bluetooth multipoint connectivity is still here, as are Fast Pair, Swift Pair and Spotify Tap. The second-gen Elite 10 can also still connect directly to smartwatches, so long as they support HFP, A2DP and AVRCP Bluetooth profiles.

The Elite 10 (2nd gen) will be available in titanium black, gloss black, brown, blue and white for $279. The Elite 8 Active (2nd gen) comes in navy, black, coral and olive green for $299. Both of those prices are $29 more than the first versions that debuted last year and these two upgraded models will be available mid-June.

This article originally appeared on Engadget at https://www.engadget.com/jabra-updated-its-elite-earbuds-with-an-le-audio-case-improved-anc-and-more-090046844.html?src=rss

Just Dance VR is coming to Meta Quest headsets in October

If you think Just Dance would be a great addition to your library of virtual reality games and experiences, then mark this date: October 15, 2024. Ubisoft is launching Just Dance VR: Welcome to Dancity that day for the Meta Quest 2, Meta Quest Pro and Meta Quest 3. You'll be able to customize your avatars for the game and choose your own body shape, facial expression, skin color, hair and outfit. Once you're done creating a virtual version of yourself, you can enter the Dancity social hub to meet other players.

You'll also have your own "apartment" in game, where you can dance with up to six players or do other interactive activities with the group, like play basketball. The game will let you send emote stickers to players who aren't in your friends list, but you can do voice chats with dancers who are. Welcome to Dancity features 360-degree environments and what Ubisoft describes as an "all-new gameplay with two-hand scoring."

You'll be able to dance to 25 hit and original songs at launch, including Don't Stop Me Now by Queen, Bad Liar by Selena Gomez, Starships by Nicki Minaj and Call Me Maybe by Carly Rae Jepsen. As UploadVR notes, the game was supposed to be exclusively available to Pico headsets. However, after the ByteDance-owned company laid off a big portion of its workforce, Ubisoft started working with a new partner (Meta) to develop the game.

This article originally appeared on Engadget at https://www.engadget.com/just-dance-vr-is-coming-to-meta-quest-headsets-in-october-043151830.html?src=rss