Although smartphone users have had the ability to quickly translate spoken words into nearly any modern language for years now, this feat has been quite tough to accomplish on small, memory-constrained microcontrollers. In response to this challenge, Hackster.io user Enzo decided to create a proof-of-concept project that demonstrated how an embedded device can determine the language currently being spoken without the need for an Internet connection.
This so-called “language detector” is based on an Arduino Nano 33 BLE Sense, which is connected to a common PCA9685 PWM/servo driver that is, in turn, attached to a set of three micro servo motors — all powered by a single 9V battery. Enzo created a dataset by recording three words: “oui” (French), “si” (Italian), and “yes” (English) for around 10 minutes each, for a total of 30 minutes of sound files. He also added three minutes of random background noise to help distinguish the target keywords from unimportant sounds.
Once a model had been trained using Edge Impulse, Enzo exported it back onto his Nano 33 BLE Sense and wrote a small bit of code that reads audio from the microphone, classifies it, and determines which word is being spoken. Based on the result, the corresponding nation’s flag is raised to indicate the language.
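Enzo’s full sketch isn’t reproduced here, but the actuation step might look something like the following minimal sketch, where one PCA9685 channel per flag is swept between a lowered and a raised position; the pulse lengths, channel assignments, and classifyKeyword() stub are illustrative assumptions, not his code:

```cpp
// Illustrative sketch of the flag-raising step (not Enzo's actual code).
// Assumes an Adafruit PCA9685 breakout on I2C driving three micro servos
// on channels 0-2; classifyKeyword() is a hypothetical placeholder for
// the Edge Impulse inference step.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm;    // default I2C address 0x40

const int SERVO_DOWN = 150;     // pulse length (out of 4096) for a lowered flag
const int SERVO_UP   = 450;     // pulse length for a raised flag

// Placeholder classifier: 0 = "oui", 1 = "si", 2 = "yes", -1 = noise/unknown.
int classifyKeyword() { return -1; }

void setup() {
  pwm.begin();
  pwm.setPWMFreq(50);           // standard 50 Hz servo refresh rate
  for (int ch = 0; ch < 3; ch++) {
    pwm.setPWM(ch, 0, SERVO_DOWN);   // start with every flag lowered
  }
}

void loop() {
  int language = classifyKeyword();
  for (int ch = 0; ch < 3; ch++) {
    // Raise only the flag that matches the detected language.
    pwm.setPWM(ch, 0, ch == language ? SERVO_UP : SERVO_DOWN);
  }
  delay(250);
}
```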
You can see the project in action below and read more about it here on Hackster.io.
This tinyML system helps soothe your dog’s separation anxiety with sounds of your voice
Arduino Team — November 17th, 2021
Due to the ongoing pandemic, Nathaniel Felleke’s family dog, Clairette, had gotten used to having people around her all the time and thus developed separation anxiety when the family would leave the house. But thanks to some clever thinking, Felleke came up with the idea to automatically detect when his dog started to bark and play some sounds of his family speaking to calm her down.
In order to detect when the dog is barking, Felleke collected plenty of audio samples from the bark portion of Google’s AudioSet, his own recordings, speech commands, and miscellaneous cat and dog noises to distinguish background audio from a bark. After passing them into Edge Impulse’s Studio and creating a keyword spotting model, he downloaded the resulting model, which was then loaded onto a Nano 33 BLE Sense. If a bark is detected using the BLE Sense’s onboard microphone, it toggles a pin high to alert a separate Arduino Nano that a random human speech sound needs to be played by sending a command to an attached Music Maker Feather board.
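The pin-based handshake between the two boards is simple; a minimal sketch of the detection side could look like this (the pin number, confidence threshold, and barkProbability() placeholder are our assumptions, not Felleke’s code):

```cpp
// Sketch of the board-to-board handshake (illustrative only).
const int ALERT_PIN = 2;           // wired to an input pin on the second Nano
const float BARK_THRESHOLD = 0.8;  // minimum confidence to count as a bark

// Hypothetical stand-in for the Edge Impulse keyword-spotting model:
// returns the confidence that the last audio window contained a bark.
float barkProbability() { return 0.0; }

void setup() {
  pinMode(ALERT_PIN, OUTPUT);
  digitalWrite(ALERT_PIN, LOW);
}

void loop() {
  if (barkProbability() > BARK_THRESHOLD) {
    digitalWrite(ALERT_PIN, HIGH);  // tell the second Nano to play a clip
    delay(100);                     // hold the pulse long enough to register
    digitalWrite(ALERT_PIN, LOW);
  }
}
```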
To see this project in action, you can watch Felleke’s demonstration video below. For the code and resulting tinyML model, his files are available here on GitHub.
Gamifying exercise can make people more motivated to participate in physical activities while having fun at the same time. This inspired a team of students from Handong Global University in Pohang, South Korea to come up with a system, dubbed “Move!,” that uses a microcontroller to detect various gestures and perform certain actions in mobile games accordingly.
They started by collecting many different gesture samples from a Nano 33 BLE Sense, which is worn by a person on their wrist. This data was then used to train a TensorFlow Lite model that classifies the gesture and sends it via Bluetooth to the host phone running the app. Currently, the team’s mobile app contains three games that a player can choose from.
There is a dinosaur game that operates similarly to the offline dinosaur game in Google Chrome where the user must jump to avoid incoming obstacles. The jumping jack game alternates between different movements that are mirrored by the player in a certain amount of time. And finally, there is a boxing game where the player punches the air when commanded onscreen.
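On the wearable side, streaming each classified gesture to the phone over Bluetooth might look roughly like this minimal sketch (the UUIDs, device name, and classifyGesture() placeholder are illustrative assumptions, not the team’s code):

```cpp
// Minimal sketch of the BLE link using the ArduinoBLE library.
#include <ArduinoBLE.h>

BLEService gestureService("180C");                       // assumed custom service
BLEByteCharacteristic gestureChar("2A56", BLERead | BLENotify);

// Hypothetical stand-in for the TensorFlow Lite model: returns an
// index identifying the recognized wrist gesture.
byte classifyGesture() { return 0; }

void setup() {
  BLE.begin();
  BLE.setLocalName("Move");
  BLE.setAdvertisedService(gestureService);
  gestureService.addCharacteristic(gestureChar);
  BLE.addService(gestureService);
  BLE.advertise();
}

void loop() {
  BLEDevice central = BLE.central();
  while (central && central.connected()) {
    // Push each new gesture to the phone as a notification.
    gestureChar.writeValue(classifyGesture());
    delay(100);
  }
}
```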
You can read more about Move! — which was one of the five winning projects in the TensorFlow Lite for Microcontrollers Challenge — here and view/download the code for both the BLE Sense and mobile app on GitHub.
Being able to add dynamic lighting and images that can synchronize with a dancer is important to many performances, which rely on both music and visual effects to create the show. Eduardo Padrón aimed to do exactly that by monitoring a performer’s moves with an accelerometer and triggering the appropriate AV experience based on the recognized movement.
Padrón’s system is designed around a Raspberry Pi 4 running an MQTT server for communication with auxiliary IoT boards. Movement data was collected with a Nano 33 BLE Sense’s onboard accelerometer and sent to a Google Colab environment, where a model was trained on these samples for 600 epochs, achieving an accuracy of around 91%. After deploying this model onto the Arduino, he was able to output the recognized gesture over USB to a running Python script. Once the gesture is received, the MQTT server publishes a message to client devices such as an ESP8266 for lighting, and plays an associated video or sound.
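A client on the lighting side could be as simple as the following sketch for the ESP8266, using the common PubSubClient library; the broker address, topic, gesture label, and pin mapping are assumptions rather than Padrón’s actual configuration:

```cpp
// Hedged sketch of an ESP8266 MQTT lighting client.
#include <ESP8266WiFi.h>
#include <PubSubClient.h>

const char* WIFI_SSID = "your-ssid";       // placeholder credentials
const char* WIFI_PASS = "your-password";
const char* BROKER_IP = "192.168.1.10";    // assumed Raspberry Pi 4 address

WiFiClient espClient;
PubSubClient mqtt(espClient);

const int LIGHT_PIN = 2;

// Called by PubSubClient whenever the Pi publishes a gesture message.
void onMessage(char* topic, byte* payload, unsigned int length) {
  // Turn the stage light on for a "spin" gesture, off otherwise
  // ("spin" is an illustrative label, not one from the project).
  bool on = (length == 4 && memcmp(payload, "spin", 4) == 0);
  digitalWrite(LIGHT_PIN, on ? HIGH : LOW);
}

void setup() {
  pinMode(LIGHT_PIN, OUTPUT);
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);
  mqtt.setServer(BROKER_IP, 1883);
  mqtt.setCallback(onMessage);
}

void loop() {
  if (!mqtt.connected() && mqtt.connect("esp8266-light")) {
    mqtt.subscribe("performance/gesture");   // assumed topic name
  }
  mqtt.loop();
}
```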
Getting in your daily exercise is vital to living a healthy life, and having proper form when squatting can go a long way towards achieving that goal without the joint pain that comes from doing squats incorrectly. The Squats Counter is a device worn around the thigh that utilizes machine learning and TensorFlow Lite to automatically track the user’s form and count how many squats have been performed.
Creator Manas Pange started his project by flashing the tf4micro-motion-kit code to a Nano 33 BLE Sense, which features an onboard three-axis accelerometer. From there, he opened the Tiny Motion Trainer Experiment by Google that connects to the Arduino over Bluetooth and captures many successive samples of motion. After gathering enough proper and improper form samples, Manas trained, tested, and deployed the resulting model to the board.
Every time a proper squat is performed, the counter ticks down by one until it reaches a predefined goal.
Use the Nano 33 BLE Sense’s IMU and gesture sensor to control a DJI Tello drone
Arduino Team — September 8th, 2021
Piloting a drone with something other than a set of virtual joysticks on a phone screen is exciting due to the endless possibilities. DJI’s Tello can do just this, as it has a simple Python API that allows basic functions such as taking off, landing, and moving within a horizontal plane to be controlled. Soham Chatterjee built a system that takes advantage of two sensors within the Arduino Nano 33 BLE Sense’s onboard suite, namely the APDS-9960 and the LSM9DS1 IMU.
He started this endeavor by creating two simple programs that ran on the BLE Sense. The first initializes the APDS-9960 to detect gestures, which then sends strings like “detected DOWN gesture” via the USB port to a host machine. The second program checks if the IMU has gone over a certain threshold in a single direction and relays a corresponding string if it has.
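Using the Arduino_APDS9960 library, the first of those two programs might look roughly like this (a sketch along the same lines, not Chatterjee’s verbatim code):

```cpp
// Gesture detection on the Nano 33 BLE Sense's APDS-9960, printing
// strings over the USB serial port for the host to parse.
#include <Arduino_APDS9960.h>

void setup() {
  Serial.begin(9600);
  while (!Serial);
  if (!APDS.begin()) {
    Serial.println("Error initializing APDS-9960 sensor!");
    while (true);
  }
}

void loop() {
  if (APDS.gestureAvailable()) {
    switch (APDS.readGesture()) {
      case GESTURE_UP:    Serial.println("detected UP gesture");    break;
      case GESTURE_DOWN:  Serial.println("detected DOWN gesture");  break;
      case GESTURE_LEFT:  Serial.println("detected LEFT gesture");  break;
      case GESTURE_RIGHT: Serial.println("detected RIGHT gesture"); break;
      default: break;
    }
  }
}
```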
A Raspberry Pi runs one of two Python scripts that read the incoming data from the Arduino and convert it into movements. For example, a gesture in the ‘DOWN’ direction lands the Tello drone, whereas tilting the board forwards moves the drone forward 50cm. As an added safety feature, the drone automatically lands after 60 seconds, although the Python script can be modified to prevent this behavior.
Monitor the pH levels of a hydroponic plant’s water supply with Arduino and tinyML
Arduino Team — September 2nd, 2021
Many plants are notorious for how picky they are about their environmental conditions. The wrong temperature, humidity, soil type, or even elevation can produce devastating effects. But perhaps none is as important and overlooked as water/soil pH, which is a measure of how acidic or alkaline the growing medium is. In hydroponics, maintaining optimal growing conditions is how high yields can be ensured without becoming too wasteful. Janet N on Hackster had the idea of harnessing embedded machine learning to let her know when the water had become unacceptable for her plants.
The device uses an Arduino Nano 33 BLE Sense to continuously monitor the pH of the hydroponics water supply with a simple probe. This data was initially loaded into Edge Impulse’s Studio where it was split into features and then sent to both a Keras classification model and an anomaly detection model for training. After she was satisfied with the performance of both, they were deployed back onto the Arduino.
As the system checks the pH of the water, it aggregates the data and places it into a buffer for classification. If the value is higher than 7, the water is too basic, and a yellow LED is turned on. If the water is too acidic (below 4), a red LED is activated. And finally, a green LED lights up when the optimal pH of around 5 has been reached.
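The LED logic itself is a straightforward set of thresholds; a minimal sketch, with assumed pin numbers and a readPH() placeholder standing in for the probe and classifier, might look like this:

```cpp
// Illustrative threshold-to-LED logic (pins and readPH() are assumptions).
const int RED_LED = 2, YELLOW_LED = 3, GREEN_LED = 4;

// Hypothetical stand-in for the probe reading / model output.
float readPH() { return 5.0; }

void setup() {
  pinMode(RED_LED, OUTPUT);
  pinMode(YELLOW_LED, OUTPUT);
  pinMode(GREEN_LED, OUTPUT);
}

void loop() {
  float ph = readPH();
  digitalWrite(YELLOW_LED, ph > 7.0 ? HIGH : LOW);   // too basic
  digitalWrite(RED_LED,    ph < 4.0 ? HIGH : LOW);   // too acidic
  digitalWrite(GREEN_LED, (ph >= 4.0 && ph <= 7.0) ? HIGH : LOW);  // near the ~5 sweet spot
  delay(1000);
}
```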
You can read more about the process of creating this project here on Hackster.io.
Predicting a lithium-ion battery’s life cycle with tinyML
Arduino Team — August 24th, 2021
Perhaps nothing is more frustrating than suddenly discovering your favorite battery-powered device has shut down due to a lack of charge, and because almost no one finds joy in calculating how long it will last based on current consumption and time used, there must be a better way. This problem is what inspired Manivannan S. to create a small project that can predict when a battery is about to go flat using the “magic” of machine learning and a voltage sensor.
The circuit for the project is quite basic, consisting of an Arduino Nano 33 BLE Sense, a 125 ohm rheostat, a voltage sensing module, and finally the rechargeable 18650 Li-ion cell. The discharge current of the battery was set at 1 ampere with the rheostat, and the voltage output was then sampled for 30 minutes at a rate of one reading per minute. This data was imported into Edge Impulse’s Studio and used to train a regression model that can predict the estimated voltage, and therefore the capacity remaining.
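A capture run like that takes only a few lines; in this hedged sketch the divider ratio and analog pin are assumptions for a typical voltage sensing module, and the output is CSV suitable for import into Edge Impulse:

```cpp
// Illustrative 30-minute discharge logger (not Manivannan's code).
const float DIVIDER_RATIO = 5.0;   // assumed voltage-divider ratio
const int   SAMPLES = 30;          // one reading per minute for 30 minutes

void setup() {
  Serial.begin(9600);
  while (!Serial);
  Serial.println("minute,voltage");
  for (int minute = 0; minute < SAMPLES; minute++) {
    // The Nano 33 BLE Sense ADC reads 0-1023 over a 0-3.3 V range.
    float volts = analogRead(A0) * (3.3 / 1023.0) * DIVIDER_RATIO;
    Serial.print(minute);
    Serial.print(',');
    Serial.println(volts, 3);      // CSV row, ready for Edge Impulse import
    delay(60000UL);                // wait one minute between readings
  }
}

void loop() {}
```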
Once tested, the model proved very successful in determining the battery’s voltage after an hour of use, after which Manivannan went on to explain how this data could be further extrapolated to estimate the complete life cycle. By incorporating machine learning into smart battery technology, power management can become more approachable and increasingly efficient.
Python support for three of the hottest Arduino boards out there is now yours. Through our partnership with OpenMV, the Nano RP2040 Connect, Nano 33 BLE and Nano 33 BLE Sense can now be programmed with the popular MicroPython language. Which means you get OpenMV’s powerful computer vision and machine learning capabilities thrown in.
OpenMV IDE and MicroPython Editor
While you can’t use Python directly with the Arduino IDE, you can use the OpenMV editor, and its version of MicroPython. From the editor, you can install MicroPython and load your scripts directly to the supported Arduino boards.
MicroPython is a lean implementation of the Python 3 programming language, designed to run on microcontrollers. There’s extensive documentation all across the web, which is another huge advantage of learning and using Python for your Arduino projects.
There are so many reasons to get excited about MicroPython for these new Arduino boards. To name a few…
OpenMV’s machine learning and computer vision tools.
Great for computer science education.
Easy for web developers and coders to switch from other platforms to Arduino.
Huge number of MicroPython libraries, tutorials, guides and support online.
Simple to upgrade hardware as project demands increase (e.g. upgrade from a Nano RP2040 Connect to a Portenta H7).
There are also lots of Arduino + Python projects that have been posted over the years. Now you can add the Nano devices to those projects and expand on them with their new MicroPython capabilities.
Get Started with Python on Arduino
To help you get cracking, we’ve put together a few guides for each of the supported Arduino boards. The Portenta H7 already supports MicroPython, but we’ve included it below for the sake of completeness.
If it’s the first time you’ve used Python on your Arduino board, you’ll need to follow a few steps to get everything working together. Depending on which board you’re using, you might need to update the bootloader to make it compatible with OpenMV. Then you can connect to the board to upload the latest firmware and make it compatible with the editor.
There are guides to take you through the process for each board, and it’s not a complex task. Once completed, your boards will be ready to be programmed using MicroPython.
These simple tutorials will get you moving quickly.
You can also find a few examples of MicroPython scripts to upload and run on the various boards. It’s a great way to test the Python waters with your Arduino boards, and pick up a couple of hints and tips on using the language.
If you’ve got any resources, hints or tips of your own when it comes to learning or using Python, please do share them with the community! We want to hear all about your experiences, and any projects you build using Arduino and Python together.
We’ll keep you updated as we add more documentation and tutorials for MicroPython over on Arduino Docs, so keep an eye out for those.
There are already countless projects that utilize individually addressable RGB LED strips in some way or another, except most of them lack a “wow” factor. This is one problem that Philipp Niedermayer’s Sphere2 Lamp does not suffer from, as it is a giant sphere comprised of 122 smaller domes (cut ping pong balls) that are each lit by their own LED. The project uses an Arduino Nano running code with the FastLED library to output signals via its GPIO pins to the LEDs. It is controlled over its serial interface by a Nano 33 BLE Sense since the latter has integrated BLE functionality.
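The Nano’s side of that arrangement might look something like this stripped-down sketch, where FastLED drives the 122 LEDs and one-character commands arrive over Serial; the data pin and command codes are illustrative assumptions, not Niedermayer’s actual protocol:

```cpp
// Stripped-down sketch of the Sphere2 Lamp's LED controller.
#include <FastLED.h>

const int NUM_LEDS = 122;
const int DATA_PIN = 6;            // assumed LED data pin
CRGB leds[NUM_LEDS];

char currentAnimation = '0';       // '0' = off, 'r' = rainbow (assumed codes)

void setup() {
  Serial.begin(115200);
  FastLED.addLeds<WS2812B, DATA_PIN, GRB>(leds, NUM_LEDS);
}

void loop() {
  // The Nano 33 BLE Sense sends single-character animation commands.
  if (Serial.available()) currentAnimation = Serial.read();

  if (currentAnimation == 'r') {
    // Simple moving rainbow across the 122 domes.
    static uint8_t hue = 0;
    fill_rainbow(leds, NUM_LEDS, hue++, 2);
  } else {
    fill_solid(leds, NUM_LEDS, CRGB::Black);
  }
  FastLED.show();
  delay(20);
}
```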
Niedermayer also wrote a dedicated app for starting and stopping animations on the Sphere2 Lamp. The Android application features an interface that lets users control not only the selected color or colors, but also the brightness and the speed at which the animation plays. Currently, the app contains around ten animations and ten color palettes, although these numbers can certainly be increased in the future.
The Sphere2 Lamp is an extremely unique-looking showcase of what is possible with just a couple of Arduino boards, some LED strips, and innovative programming. You can view the project’s write-up here on Hackster.io and see its code on GitHub.
Shortly after the COVID-19 pandemic began, Samuel Alexander and his housemates purchased a ping pong set and began to play — a lot. Becoming quite good at the game, Alexander realized that his style was not consistent with how more professional table tennis players hit the ball, as he simply taught himself without a coach. Because of this, he was inspired to create a smart paddle that uses an integrated IMU to intelligently classify which moves he makes and correct his form to improve it over time.
Alexander went with the Nano 33 BLE Sense board due to its ease of use and tight integration with TensorFlow Lite Micro, not to mention the onboard 6DOF accelerometer/gyroscope module. He began by designing a small cap that fits over the bottom of a paddle’s handle and contains all the electronics and battery circuitry. With the hardware completed, it was time to get started with the software.
The Tiny Motion Trainer by Google Creative Lab was employed to quickly capture data from the Arduino over Bluetooth and store the samples for each motion. Once all of the movements had been gathered, Alexander trained the model for around 90 epochs and was able to achieve an impressive level of accuracy. His build log and demonstration video below show how this smart paddle can be used to intelligently classify and coach a novice player into using better form while playing, and it will be fun to see just how good the model can get.
Snoring is an annoying problem that affects nearly half of all adults and can cause others to lose sleep. Additionally, the ailment can be a symptom of a more serious underlying condition, so being able to know exactly when it occurs could be lifesaving. To help solve this issue, Naveen built the Snoring Guardian — a device that can automatically detect when someone is snoring and begin to vibrate as an alert.
The Snoring Guardian features a Nano 33 BLE Sense to capture sound from its onboard microphone and determine if it constitutes a snore. He employed Edge Impulse along with the AudioSet dataset, which contains thousands of labeled sound samples that can be used to train a TensorFlow Lite Micro model. The dataset within Edge Impulse was split between snoring and noise, with the latter label for filtering out external sounds that are not snores. With the spectrograms created and the model trained, Naveen deployed it to his Nano 33 BLE Sense as an Arduino library.
The program for the Snoring Guardian gathers new microphone data and passes it to the model for inference. If the resulting label is “snoring,” a small vibration motor is activated that can alert the wearer. As an added bonus, the entire thing runs off rechargeable LiPo batteries, making this an ultra-portable device. You can see a real-time demonstration below as well as read more about this project on Hackster.io.
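On the Nano 33 BLE Sense, pulling audio from the onboard microphone goes through the PDM library; a minimal version of that capture-and-classify loop could look like this (the motor pin and isSnore() placeholder are assumptions, not Naveen’s exact code):

```cpp
// Minimal capture-and-classify loop using the onboard PDM microphone.
#include <PDM.h>

const int MOTOR_PIN = 3;           // assumed vibration-motor pin
short sampleBuffer[512];
volatile int samplesRead = 0;

// Interrupt callback: copy the freshly captured PDM samples.
void onPDMdata() {
  int bytesAvailable = PDM.available();
  PDM.read(sampleBuffer, bytesAvailable);
  samplesRead = bytesAvailable / 2;  // two bytes per 16-bit sample
}

// Hypothetical stand-in for the Edge Impulse inference call.
bool isSnore(short* samples, int count) { return false; }

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  PDM.onReceive(onPDMdata);
  PDM.begin(1, 16000);             // one channel at 16 kHz
}

void loop() {
  if (samplesRead > 0) {
    digitalWrite(MOTOR_PIN, isSnore(sampleBuffer, samplesRead) ? HIGH : LOW);
    samplesRead = 0;
  }
}
```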
Whether commuting to work or simply having fun around town, riding a bike can be a great way to get exercise while also enjoying the scenery. However, riding on the road presents a danger, as cars, other cyclists, or pedestrians might not be paying attention while you try to turn. That is why Alvaro Gonzalez-Vila created VoiceTurn, a set of turn signals that are activated by simply saying which direction you are turning.
VoiceTurn works by using an Arduino Nano 33 BLE Sense at its heart to both listen for the “left” or “right” keywords and then activate the appropriate turn signal. Gonzalez-Vila took advantage of edge machine learning through the Edge Impulse Studio. First, he collected audio samples of the words “left” and “right,” plus random noise, via the Google Speech Commands Dataset. Next, he sent them through an MFCC block that extracts human speech features. And finally, a Keras neural network was trained on these features to produce a model.
With the model deployed to the Nano 33 BLE Sense, Gonzalez-Vila developed a simple program that continually reads in a waveform from the microphone and passes it to the model for inference. Based on the result, a string of NeoPixels on either the left or right will begin to light up for a predetermined number of cycles. As seen in his video below, the VoiceTurn works really well at detecting keywords and is easy to see from a distance. You can read more about how this project was built in its write-up here.
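The signaling step might look like the following minimal sketch; the strip lengths, pins, blink timing, and detectKeyword() placeholder are illustrative assumptions rather than Gonzalez-Vila’s code:

```cpp
// Illustrative turn-signal logic with two NeoPixel strips.
#include <Adafruit_NeoPixel.h>

const int PIXELS_PER_SIDE = 8;
Adafruit_NeoPixel leftStrip(PIXELS_PER_SIDE, 5, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel rightStrip(PIXELS_PER_SIDE, 6, NEO_GRB + NEO_KHZ800);

// Hypothetical inference placeholder: -1 = nothing, 0 = "left", 1 = "right".
int detectKeyword() { return -1; }

void blinkStrip(Adafruit_NeoPixel& strip, int cycles) {
  for (int c = 0; c < cycles; c++) {
    strip.fill(strip.Color(255, 120, 0));  // amber
    strip.show();
    delay(400);
    strip.clear();
    strip.show();
    delay(400);
  }
}

void setup() {
  leftStrip.begin();
  rightStrip.begin();
}

void loop() {
  int keyword = detectKeyword();
  if (keyword == 0) blinkStrip(leftStrip, 10);
  if (keyword == 1) blinkStrip(rightStrip, 10);
}
```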
‘Droop, There It Is!’ is a smart irrigation system that uses ML to visually diagnose drought stress
Arduino Team — July 13th, 2021
Throughout the day, as the sun evaporates water from a plant’s leaves via a process called transpiration, observers will notice that the leaves tend to get a little droopy. Also known as drought stress, this response to a loss of water results in low turgidity (internal water pressure) and can impact the plant’s ability to grow correctly. Traditional irrigation monitors use soil moisture sensors to determine the soil’s water levels, but Terry Rodriquez and Salma Mayorquin wanted to create something a bit more unique: a visual droop detection system.
Their device, which they affectionately call the “Droop, There It Is”, features a Nano 33 BLE Sense and ArduCam camera module to take pictures of the plant and uses an image classifier to determine if the plant is drooping or not. They started by taking a pre-trained MobileNetV2 base model and fine-tuned it with a set of 6,000 images. After optimizing the result with grayscale reductions and knowledge distillation techniques, the team deployed it onto their Nano 33 BLE Sense for inferencing.
Although the device only signals when the plant needs water over Bluetooth Low Energy for now, it can be augmented in the future to directly control pumps and valves if needed. This project is a great demonstration of how machine learning can be harnessed to reduce overwatering and increase efficiency. You can read more about it here or check out their video below!
A dangerous fall can happen to anyone, but falls are particularly dangerous among the elderly, as that demographic might not have effective ways to get help when needed. Rather than having to purchase an expensive device that costs up to $100 per month to use, Nathaniel F. on Hackster wanted to build a project that harnessed the power of embedded machine learning to detect falls and send an alert. His solution involves the Arduino Nano 33 BLE Sense board, which not only has an integrated accelerometer, but also contains Bluetooth Low Energy capabilities that let the processor communicate with the accompanying mobile app.
Nathaniel trained his ML model on the SmartFall dataset, which allows the device to respond to a wide variety of falls and ignore non-harmful movements. Once training was completed, he was able to achieve an accuracy of 95%. The Nano 33 BLE Sense samples accelerometer data at 31.25Hz to match the dataset’s frequency, and it makes a prediction every two seconds. If a fall is detected or the built-in emergency button is pressed, the user has 30 seconds to deactivate the alarm; otherwise, it sends a BLE message to the phone, which in turn sends an SMS message containing the current location to an emergency contact.
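That sampling scheme works out to one accelerometer reading every 32 ms; a hedged sketch of the loop, with the buffer size and isFall() placeholder as assumptions, might look like this:

```cpp
// Illustrative 31.25 Hz sampling loop with a two-second inference window.
#include <Arduino_LSM9DS1.h>

const unsigned long SAMPLE_INTERVAL_MS = 32;  // 1000 / 31.25
const int WINDOW = 62;                        // about two seconds of samples
float window[WINDOW][3];
int idx = 0;

// Hypothetical stand-in for the SmartFall-trained model.
bool isFall(float data[][3], int count) { return false; }

void setup() {
  IMU.begin();
}

void loop() {
  static unsigned long last = 0;
  if (millis() - last >= SAMPLE_INTERVAL_MS && IMU.accelerationAvailable()) {
    last = millis();
    IMU.readAcceleration(window[idx][0], window[idx][1], window[idx][2]);
    if (++idx >= WINDOW) {
      idx = 0;
      if (isFall(window, WINDOW)) {
        // Here the real device starts the 30-second countdown and,
        // if not deactivated, notifies the phone over BLE.
      }
    }
  }
}
```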
Even though this DIY fall detector works well already, Nathaniel plans on making a custom PCB and extending the battery life for longer use time between charging. You can read more about his design here, and you can view his demonstration video below.
One major drawback to the large-scale farming of animals for meat consumption is the tendency for diseases to spread rapidly and decimate the population. This widespread issue is what drove Clinton Oduor to build a tinyML-powered device that can perform precision livestock farming tasks intelligently. His project works by continuously monitoring the noises coming from pigs and determining what they mean, such as whether a cough is indicative of a respiratory illness or a squeal denotes stress.
Oduor gathered the sound samples for his dataset by downloading around seven minutes of coughing pig sounds and then splitting them up into one-second-long files. After applying data augmentation, a technique that generates additional samples from existing ones, he trained a neural network with Edge Impulse and was able to achieve 99.7% accuracy. As for deployment, the model runs on an Arduino Nano 33 BLE Sense, which has an onboard microphone for picking up ambient sounds. When coughing is detected, it sends some data via I2C to a MKR FOX 1200 board that broadcasts a message over the Sigfox network.
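The link between the two boards could be as simple as a one-byte I2C message; in this illustrative sketch the address and message format are assumptions, with coughDetected() standing in for the classifier:

```cpp
// Hedged sketch of the Nano 33 BLE Sense side of the I2C link.
#include <Wire.h>

const byte MKRFOX_ADDR = 0x08;   // assumed address of the MKR FOX 1200
const byte MSG_COUGH   = 0x01;

// Hypothetical stand-in for the Edge Impulse cough classifier.
bool coughDetected() { return false; }

void setup() {
  Wire.begin();                  // join the I2C bus as the controller
}

void loop() {
  if (coughDetected()) {
    Wire.beginTransmission(MKRFOX_ADDR);
    Wire.write(MSG_COUGH);       // the MKR FOX relays this over Sigfox
    Wire.endTransmission();
    delay(60000UL);              // rate-limit: Sigfox allows only a few uplinks
  }
}
```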
The developer plans on collecting more data from various pig breeds and at different stages of growth to further enhance the diversity of the model and increase its accuracy. As a more advanced challenge, he would also like to have his device recognize specific cough patterns for certain types of respiratory diseases. You can read more about his project here.
There are thousands of bird species in the world, with numerous different and unique ones living in various areas. Developers Errol Joshua, Mahesh Nayak, Ajith K J, and Supriya Nickam wanted to build a simple device that would allow them to automatically recognize the feathered friends near them and do some simple tracking, such as knowing how often a particular bird makes its call. Their project uses a Nano 33 BLE Sense, along with its onboard microphone, to pick up sounds and make inferences about what they are in real-time.
The team decided to train their tinyML model to detect four different species that are native to their area and then downloaded a sample dataset containing many sound files. After a bit of editing, they transferred the audio clips into Edge Impulse’s Studio and subsequently labeled each one. The Impulse consisted of a Mel-filter-bank energy (MFE) block that took the sounds and produced a spectrogram for each one. With these processed features, the model was able to achieve an impressive 95.9% accuracy.
As seen in their demonstration video below, the current bird sound being played was picked up and identified accurately by the Nano 33 BLE Sense. And with some minor changes to how the model was trained, the accuracy can be increased even more. You can read about this project on its page.
This pocket-sized device uses tinyML to analyze a COVID-19 patient’s health conditions
Arduino Team — June 21st, 2021
In light of the ongoing COVID-19 pandemic, being able to quickly determine a person’s current health status is very important. This is why Manivannan S wanted to build his very own COVID Patient Health Assessment Device that could take several data points from various vitals and make a prediction about what they indicate. The pocket-sized system features a Nano 33 BLE Sense at its core, along with a Maxim Integrated MAX30102 pulse oximeter/heart-rate sensor to measure oxygen saturation and pulse.
From this incoming health data, Manivannan developed a simple algorithm that generates a “Health Index” score by plugging in factors such as SpO2, respiration rate, heart rate, and temperature into a linear regression. Once some sample data was created, he sent it to Edge Impulse and trained a model that uses a series of health indices to come up with a plausible patient condition.
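As a rough picture of what such a model computes, here is a hedged sketch of a linear Health Index function; the coefficients are made-up placeholders, not Manivannan’s fitted values:

```cpp
// Illustrative linear "Health Index" regression (placeholder weights).
float healthIndex(float spo2, float respRate, float heartRate, float tempC) {
  // Generic linear model: index = b0 + b1*x1 + b2*x2 + b3*x3 + b4*x4.
  const float b0 = 100.0;  // intercept (placeholder)
  const float b1 = 0.5;    // weight on SpO2 (%)
  const float b2 = -0.4;   // weight on respiration rate (breaths/min)
  const float b3 = -0.1;   // weight on heart rate (bpm)
  const float b4 = -1.2;   // weight on body temperature (deg C)
  return b0 + b1 * spo2 + b2 * respRate + b3 * heartRate + b4 * tempC;
}

void setup() {
  Serial.begin(9600);
  while (!Serial);
  // Example reading: SpO2 97%, 16 breaths/min, 72 bpm, 36.8 C.
  Serial.println(healthIndex(97.0, 16.0, 72.0, 36.8));
}

void loop() {}
```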
After deploying the model to the Nano 33 BLE Sense, Manivannan put some test data on it to simulate a patient’s vital signs and see the resulting inferences. As expected, his model successfully identified each one and displayed it on an OLED screen. To read more about how this device works, plus a few potential upgrades, you can visit its write-up on Hackster.io here or check out the accompanying video below.
PUPPI is a tinyML device designed to interpret your dog’s mood via sound analysis
Arduino Team — June 18th, 2021
Dogs are not known to be the most advanced communicators, so figuring out what they want based on a few noises and pleading looks can be tough. This problem is what inspired a team of developers to come up with PUPPI — a small device that utilizes tinyML to interpret your canine companion’s mood through vocal signals. Their project employs an Arduino Nano 33 BLE Sense and its onboard microphone to both capture the data and run inferencing with the model they trained using Edge Impulse. After collecting ample amounts of data for barks, growls, whines, and other noises, their model achieved an accuracy of around 92%.
Once deployed to the physical device, the board continuously takes in new sound data and comes up with a prediction for what kind of noise it is. This data is then sent over Bluetooth Low Energy to an app that displays what the board is hearing, along with lighting up the onboard LED as a secondary indicator.
The PUPPI is a cool showcase of the power contained within edge ML devices, and it will be exciting to see increased granularity in the classifications as more data is added. You can read more about this project here on Hackster.io.
For the hearing impaired, communicating with others can be a real challenge, and this is especially problematic when it is a deaf parent trying to understand what their child needs, as the child is too young to learn sign language. Mithun Das was able to come up with a novel solution that combines a mobile app, machine learning, and a Neosensory Buzz wristband to enable this channel of communication.
Called the “Baby Connect”, Das’ system involves using a mobile app with a series of images that correspond to various feelings, actions, or wants/needs of a child. When something is requested, such as wanting to take a nap, the action is mapped to a sort of Morse code language that buzzes the four haptic motors on the Neosensory Buzz in a certain pattern. For instance, dislike is mapped to a dot, dash, and then dot, while yes is a single dot.
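One way to encode such patterns is a simple dot/dash string per message; this generic sketch drives four motors directly and is purely illustrative, since the actual project uses the Neosensory Buzz and its own SDK:

```cpp
// Generic dot/dash haptic pattern player (illustrative pins and timings).
const int DOT_MS = 150, DASH_MS = 450, GAP_MS = 150;
const int MOTOR_PINS[4] = {2, 3, 4, 5};   // assumed motor pins

void buzzAll(int durationMs) {
  for (int m = 0; m < 4; m++) digitalWrite(MOTOR_PINS[m], HIGH);
  delay(durationMs);
  for (int m = 0; m < 4; m++) digitalWrite(MOTOR_PINS[m], LOW);
  delay(GAP_MS);
}

// Play a pattern string, e.g. ".-." for "dislike" or "." for "yes".
void playPattern(const char* pattern) {
  for (const char* p = pattern; *p; p++) {
    buzzAll(*p == '.' ? DOT_MS : DASH_MS);
  }
}

void setup() {
  for (int m = 0; m < 4; m++) pinMode(MOTOR_PINS[m], OUTPUT);
  playPattern(".-.");   // "dislike": dot, dash, dot
}

void loop() {}
```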
The Baby Connect also has some more advanced features including baby activity monitoring and environmental logging. Because deaf parents are unable to hear the difference between certain cries, the Nano 33 BLE Sense that controls the device runs a model trained with Edge Impulse that can distinguish between cries for pain, hunger, and general malaise. Finally, there’s the ability to use the app as a speech-to-text converter that takes words and changes them automatically into mapped vibrations.
Epilepsy can be a very terrifying and dangerous condition, as sufferers often experience seizures that can result in a loss of motor control and even consciousness, which is why one team of developers wanted to do something about it. They came up with a simple yet clever way to detect when someone is having a convulsive seizure and then send out an alert to a trusted person. The aptly named Epilet (epilepsy + bracelet) system uses a Nano 33 BLE Sense along with its onboard accelerometer to continually read data and infer whether the sensor is picking up unusual activity.
The Epilet was configured to leverage machine learning for seizure detection, trained using data captured from its accelerometer within Edge Impulse’s Studio. The team collected 30 samples each of both normal, everyday activities and seizures. From this, they trained a model that is able to correctly classify a seizure 97.8% of the time.
In addition to the physical device itself is an accompanying mobile app that handles the communication. When it receives seizure activity that lasts for at least 10 seconds from the Nano 33 BLE Sense, the app sends an SMS message to a contact of the user’s choice. The Epilet has a lot of potential to help people suffering from epilepsy, and it will be exciting to see what other features get added to it in the future.
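Whether the ten-second rule is enforced on the board or in the app, the persistence logic itself is simple; here is a minimal on-device version, with seizureDetected() as a hypothetical placeholder for the classifier:

```cpp
// Illustrative ten-second persistence check before raising an alert.
bool seizureDetected() { return false; }   // hypothetical model output

unsigned long seizureStart = 0;
bool alerted = false;

void setup() {}

void loop() {
  if (seizureDetected()) {
    if (seizureStart == 0) seizureStart = millis();
    // Only alert once the activity has persisted for a full ten seconds.
    if (!alerted && millis() - seizureStart >= 10000UL) {
      // Here the real device would notify the companion app over BLE,
      // which then sends the SMS.
      alerted = true;
    }
  } else {
    seizureStart = 0;            // activity stopped; reset the window
    alerted = false;
  }
}
```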
Bike locks have not changed that much in the last few decades, even though our devices have gotten far smarter, so they seem in need of an update. Designed with this in mind, the TapLock is able to intelligently lock and unlock from either Bluetooth or taps on the enclosure. It uses a Nano 33 BLE Sense to detect tap patterns via an onboard accelerometer as well as BLE capabilities to communicate with the owner’s phone.
Because taps are not necessarily directional, the TapLock’s creators took an average of each accelerometer axis and charted the time between the peaks. After collecting a large sample of data, they used Edge Impulse to process the data and then train a model with an accuracy of 96.4%. This allows the owner to have some wiggle room when trying to lock or unlock the bike.
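That direction-independent feature extraction can be sketched in a few lines; the threshold and debounce values here are assumptions, not the team’s actual parameters:

```cpp
// Illustrative tap-feature extraction: average the absolute axes into
// one magnitude, then log the time between peaks.
#include <Arduino_LSM9DS1.h>

const float PEAK_THRESHOLD = 1.5;          // in g, assumed
unsigned long lastPeak = 0;

void setup() {
  Serial.begin(9600);
  IMU.begin();
}

void loop() {
  float x, y, z;
  if (IMU.accelerationAvailable()) {
    IMU.readAcceleration(x, y, z);
    // Direction-independent magnitude: mean of the absolute axes.
    float magnitude = (fabs(x) + fabs(y) + fabs(z)) / 3.0;
    if (magnitude > PEAK_THRESHOLD && millis() - lastPeak > 100) {
      if (lastPeak > 0) {
        Serial.println(millis() - lastPeak);  // inter-tap interval (ms)
      }
      lastPeak = millis();
    }
  }
}
```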
The team also developed a mobile app, which provides another way for the bike’s owner to lock or unlock the bike, along with some extra features. After connecting to the TapLock, the app loads the previous state of the lock and updates itself if needed. If the user wants to lock the bike, the app sends a “lock” command to the TapLock and stores the current location to show on a map. This way the owner won’t forget where their bike is when trying to retrieve it.
Currently, the TapLock doesn’t have a physical locking mechanism, but the team states that one can be added and then electronically activated from one of the Nano 33 BLE Sense’s GPIO pins. You can see a demo of this project in the video below and read about it on Hackster.io.