WO2019158222A1 - Methods and user devices for determining wind speed - Google Patents
- Publication number: WO2019158222A1
- Application: PCT/EP2018/054049
- Authority: WIPO (PCT)
Classifications
- G01P5/20 — Measuring speed of fluids, e.g. of air stream, by measuring the time taken to traverse a fixed distance using particles entrained by a fluid stream
- G01P5/001 — Full-field flow measurement, e.g. determining flow velocity and direction in a whole region at the same time, flow visualisation
Definitions
- the disclosure relates to a user device for determining wind speed, a method of determining wind speed using a user device, a corresponding computer program, and a corresponding computer program product.
- the large number of user devices (e.g. phones and tablets) in circulation makes such devices well placed for the collection of local data.
- One field in which collection of local data may be useful is in weather reporting and forecasting.
- Data collection from user devices may be used to improve weather reporting and to provide meteorological information that might be used to improve weather forecasting (e.g. by providing improved local boundary conditions for weather models).
- anemometers (e.g. cup anemometers or vane anemometers) may be added to mobile phones or other user devices to enable the user device to measure wind speed.
- a user device for determining wind speed.
- the user device comprises a processor, a memory, an illumination source and an imaging module for capturing image data.
- the memory contains instructions executable by the processor to cause the user device to illuminate an airborne particle using the illumination source, record, using the imaging module, image data of the particle whilst the particle is illuminated by the illumination source and determine a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
- imaging modules such as cameras and videoing equipment found in user devices, such as mobile phones, smartphones, smartwatches, tablet computers, laptop computers or games consoles may be used in conjunction with timing information to image an airborne particle and compute the speed of movement of the airborne particle. In this way, wind speed may be inferred using a user device without the use of specialised hardware modules.
- a method of determining wind speed using a user device comprises illuminating an airborne particle using an illumination source of the user device and recording, using an imaging module of the user device, image data of the particle whilst the particle is illuminated by the illumination source.
- the method further comprises determining a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
- a computer program comprising instructions which, when executed on at least one processor, cause the at least one processor to carry out the method of the second aspect.
- a computer program product comprising a computer-readable medium with the computer program of the third aspect.
- the solutions herein provide low-cost, easily deployable ways to measure wind speed using a user device.
- the solutions herein therefore facilitate crowd-based measurements of wind speed that can be used to improve meteorological data collection.
- Figure 1 shows an example user device for determining wind speed according to some embodiments herein
- Figures 2a and 2b illustrate how a user device may be used to determine wind speed according to some embodiments herein;
- Figure 3 shows a schematic illustrating how a user device may be used to determine wind speed according to some embodiments herein;
- Figure 4 shows a method for determining wind speed according to some embodiments herein.
- FIG. 1 shows an example user device 100 for determining wind speed according to embodiments herein.
- the user device 100 comprises a processor 102, a memory 104, an illumination source 106 and an imaging module 108 for capturing image data.
- the memory contains instructions executable by the processor to cause the user device to illuminate an airborne particle using the illumination source 106 and record, using the imaging module 108, image data of the particle whilst the particle is illuminated by the illumination source 106.
- the instructions when executed by the processor further cause the user device 100 to determine a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
- the user device is operative, configured or adapted to perform the steps described above.
- the user device 100 may comprise any type of user device, such as a consumer electronics device, or portable electronics device.
- the user device 100 may comprise a mobile phone, smartphone, smartwatch, tablet computer, laptop computer or games console.
- the user device 100 may comprise an unmanned aerial vehicle (UAV) such as a drone.
- the user device 100 may comprise a vehicular console such as an electronic console in a car or lorry. More generally the user device 100 may comprise any electronic user equipment comprising an imaging module 108 and an illumination source 106.
- the memory 104 may comprise instruction data representing instructions.
- the processor 102 may be configured to communicate with the memory 104 and to execute the instructions.
- the instructions when executed by the processor may cause the processor to send instructions to the illumination source 106 and/or the imaging module 108.
- the instructions when executed by the processor may also cause the processor to determine, calculate, or process in any other way, data related to the methods and processes here. In this way, the user device 100 may perform any of the embodiments of the methods described below, such as the method 400.
- the memory 104 may be configured to store the instruction data in the form of program code that can be executed by the processor 102.
- the instruction data can comprise a plurality of software and/or hardware modules that are each configured to perform, or are for performing, individual or multiple steps of the method described herein. In some embodiments all modules described herein are implemented as hardware and in others all of them are implemented through software.
- the memory 104 may be part of a device that also comprises one or more other components of the user device 100 (for example, the processor 102 and/or one or more other components of the user device 100). In alternative embodiments, the memory 104 may be part of a separate device to the other components of the user device 100.
- the memory 104 may comprise a plurality of sub-memories, each sub-memory being capable of storing a piece of instruction data.
- instruction data representing the instructions may be stored at a single sub-memory.
- instruction data representing the instructions may be stored at multiple sub-memories.
- the instruction data representing different instructions may be stored at one or more different locations in the user device 100.
- the memory 104 may be used to store information, such as image data or other data relevant to determinations made by the processor 102 of the user device 100 or from any other components of the user device 100.
- the processor 102 can comprise one or more processors, processing units, multi-core processors and/or modules that are configured or programmed to control the user device 100 in the manner described herein.
- the processor 102 may comprise a plurality of (for example, interoperated) processors, processing units, multi-core processors and/or modules configured for distributed processing. It will be appreciated by a person skilled in the art that such processors, processing units, multi-core processors and/or modules may be located in different locations and may perform different steps and/or different parts of a single step of the method described herein.
- the illumination source 106 may comprise any source of electromagnetic radiation.
- the illumination source may comprise a light-emitting diode (LED), light bulb, laser, infrared (IR) diode or any other light source.
- the illumination source 106 may emit optical electromagnetic radiation, infrared electromagnetic radiation, or electromagnetic radiation at any other frequency or in any other frequency band that can be used to illuminate an airborne particle.
- the illumination source 106 may provide periodic illumination or illumination for a fixed duration of time.
- the LED may comprise a pulse-width-modulation LED. Pulse-width-modulation LEDs operate at high frequency (e.g. repeatedly turn on and off), the frequency being high enough that the human eye cannot resolve the inherent flicker.
- the illumination source may provide periodic illumination (e.g. turn on and off) at a frequency, v, the periodic illumination comprising pulses having a pulse duration dt_illum.
- the imaging module 108 may comprise a camera, video equipment or any other equipment suitable for recording (e.g. taking or acquiring) image data.
- image data may comprise, for example, visual images, sequences of visual images, video recordings and/or live images (e.g. picture-movie mixtures, holding 10-100 ms of frames that can be used as a short animation) of an airborne particle.
- the imaging module 108 is suitable for recording images of particles at the wavelength of electromagnetic radiation emitted by the illumination source 106.
- the imaging module may record (or take) image data for a predefined period of time.
- the imaging module may have a shutter speed, and the imaging module may record image data of the airborne particle during a duration of time dt_record associated with the shutter speed.
- the user device 100 may further comprise additional components to those listed here.
- the user device 100 may comprise a communications interface, for example, for sending or receiving information over a wireless or wired connection, e.g., a cellular communications network, for example, such as a 2G, 3G, 4G or 5G cellular network, or a wireless local area network (WLAN)/Wi-Fi network.
- the user device may further comprise one or more user interfaces for receiving user input and/or displaying information to a user.
- the user device 100 may further comprise a keyboard, mouse, microphone, display screen, touchscreen, and/or speakers. Examples of other components that may be comprised in the user device 100 include a power source such as a battery or mains power connection.
- the instructions when executed by the processor cause the user device 100 to illuminate an airborne particle using the illumination source 106 (e.g. the instructions when executed by the processor may cause the processor to send an instruction to the illumination source to cause the illumination source to begin illuminating).
- an airborne particle may comprise any airborne particle small enough such that the speed of the airborne particle as it moves through the air is approximately equal to the wind speed (e.g. the airborne particle should generally follow the fluid flow dynamics of the air in which it is moving).
- airborne particles that might be imaged by the user device 100 include mist, water drops/droplets (e.g. rain), sand particles, snowflakes, ice crystals, hail, pollen, leaves, pine needles, or small insects such as gnats.
- the user device 100 may therefore be operative to illuminate (e.g. shine a light on) such an airborne particle, as described above.
- the instructions when executed by the processor further cause the user device 100 to record, using the imaging module 108, image data of the particle whilst the particle is illuminated by the illumination source 106 (e.g. the instructions may cause the processor to send an instruction to the imaging module instructing the imaging module to begin recording).
- the instructions when executed by the processor further cause the user device 100 to determine a wind speed by calculating a speed of movement of the particle based on a change of position (e.g. the trajectory) of the particle in the image data and timing information related to the capture of the image data.
- the timing information may relate to a duration of time during which the change of position of the particle in the image data occurred.
- the timing information may comprise timing information relating to a duration of time in which the particle was recorded and/or a duration of time during which the particle was illuminated (such as the durations dt_illum or dt_record above).
- the timing information may comprise timing information associated with a (e.g. mechanical) function of the user device, for example, a shutter speed or exposure time of the imaging module as described below.
- the distance travelled by the particle may be determined from the change of position of the particle in the image (e.g. the particle trajectory in the image) as will be described below.
- the speed of the particle may be calculated, for example, from the relationship:
- speed = (distance travelled, as indicated by the change of position of the particle in the image data) / (duration of time indicated by, or derived from, the timing information). Generally, therefore, a speed may be calculated based on a change of position of the particle which is captured in the image data and a period of time during which the change of position is captured.
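The relationship above can be sketched as follows; the function name and example values are illustrative assumptions, not taken from the patent:

```python
# Illustrative sketch: speed of an imaged particle from the distance it
# travelled and the duration over which that movement was captured.

def particle_speed(distance_travelled_m, duration_s):
    """Speed = distance travelled / duration over which it was captured."""
    if duration_s <= 0:
        raise ValueError("duration must be positive")
    return distance_travelled_m / duration_s

# A particle that moved 0.05 m during a 0.01 s capture window: ~5 m/s.
speed = particle_speed(0.05, 0.01)
```
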
- speed is a scalar quantity that describes how quickly a particle is moving.
- Velocity is a vector quantity that describes both how fast a particle is moving and the direction of travel.
- the directional aspect of velocity may be captured by the direction of the velocity vector in the vector space.
- the speed may be calculated from the velocity vector as the length or norm (e.g. magnitude) of the velocity vector.
- the timing information may comprise, for example, one or more of: i) a shutter speed or exposure time of the imaging module, ii) a duration of time during which the particle is recorded by the imaging module, and iii) a duration of time during which the particle is illuminated by the illumination source.
- the user device may comprise an illumination source in the form of an LED (e.g. such as the LED on a mobile phone).
- the LED light may be switched on and off with a frequency, v, and consequently illuminate the particle during a duration of time t_illum corresponding to 1/v.
- the imaging module on the user device may record image data during the interval t_illum.
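As an illustrative sketch of this timing relationship (the function names and values are assumptions, not from the patent): a source pulsing at frequency v lights the particle for 1/v seconds, so the length of the streak the particle leaves in a frame divided by that interval gives its speed.

```python
# Illustrative sketch of pulsed-illumination timing.

def illumination_duration(frequency_hz):
    # Each pulse illuminates the particle for roughly 1/v seconds.
    return 1.0 / frequency_hz

def speed_from_streak(streak_length_m, frequency_hz):
    # Streak length divided by illumination interval gives particle speed.
    return streak_length_m / illumination_duration(frequency_hz)

# A 2 mm streak recorded under 1 kHz pulsed illumination: ~2 m/s.
v = speed_from_streak(0.002, 1000.0)
```
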
- the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to determine the wind speed based on a focal length of the imaging module used for recording the image data (e.g. the distance from the user device at which objects, such as particles, appear in focus). For example, the actual distance travelled by the particle, as indicated by the change of position of the particle in the image data, may be determined using the focal length of the imaging module.
- the focal length of the imaging module 108 may be set to a fixed focal length. This may enable particles at a specific distance from the camera lens (i.e. particles at the fixed focal length) to be considered.
- a fixed focal length may be selected in a range where the optical system of the imaging module provides optimal distance resolution, e.g. a range where the imaging module 108 has the best depth granularity.
- a focal length may be selected (e.g. automatically) by the imaging module 108 of the user device 100. This may enable an "optimal" focal length to be selected by the imaging module 108, based on ambient conditions. It will be understood that this approach may comprise a standard camera operation procedure, e.g. whereby a button is pressed, the camera estimates the optimal focal length and captures the image. Generally, auto-focus of a user device may be achieved using processes such as Contrast Detection, Laser Autofocus, or Phase Detection.
- the focal length of an imaging module (e.g. the distance from the user device at which objects, such as particles, appear in focus) may generally be determined, for example, using known commands on a user device.
- the getFocusDistances(float[] output) call available on some Android user devices may be used, which gets the distances from the camera at which an object appears to be in focus.
- Figure 2a shows a schematic illustrating how an example user device may be used to calculate wind speed according to some embodiments herein and illustrates some of the principles described above.
- Figure 2a shows a user device 100 comprising an imaging module 108 and an illumination source 106. In use, the imaging module 108 captures image data within a field of view indicated by the arrow 202.
- the imaging module has a focal length 203 and a corresponding focal plane 204.
- the focal plane is defined as the plane in which an imaged particle (or any object) may be recorded by the imaging module in focus (e.g. a plane defined by the fact that all points in the plane lie at a distance from the imaging module corresponding to the focal length).
- the illumination source 106 illuminates the field of view 202.
- the imaging module 108 records image data of a particle 206 passing through the focal plane 204 (defined by the focal length 203) in a direction 208.
- a processor (not illustrated in Figure 2a) in the user device 100 may process the recorded image data and determine a change of position of the particle in the image data.
- the skilled person will be familiar with image processing tools that may be used to determine the location of a particle in an image.
- open source libraries such as the Open Source Computer Vision Library (OpenCV) software development kit, suitable for use in Android, comprise commands such as SimpleBlobDetector(), which may be used to detect 'blobs' or groups of connected pixels such as the particles herein.
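As a simplified, dependency-free stand-in for such blob detection (in practice OpenCV's SimpleBlobDetector would be used; all names and values here are illustrative), the pixel displacement of a single bright particle between two frames can be estimated from the centroid of above-threshold pixels:

```python
# Simplified stand-in for blob detection: locate the centroid of bright
# pixels in each frame, then measure the displacement in pixels between
# two frames. Frames are plain 2D lists of pixel intensities.

def bright_centroid(frame, threshold=128):
    xs = ys = n = 0.0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs, ys, n = xs + x, ys + y, n + 1
    return (xs / n, ys / n) if n else None

def pixel_displacement(frame_a, frame_b, threshold=128):
    xa, ya = bright_centroid(frame_a, threshold)
    xb, yb = bright_centroid(frame_b, threshold)
    return ((xb - xa) ** 2 + (yb - ya) ** 2) ** 0.5

# A single bright pixel at (x=1, y=1) moving to (x=4, y=5) is displaced
# 5 pixels (a 3-4-5 triangle):
frame_a = [[0] * 6 for _ in range(6)]; frame_a[1][1] = 255
frame_b = [[0] * 6 for _ in range(6)]; frame_b[5][4] = 255
```
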
- the skilled person will be further familiar with methods to convert a distance travelled by a particle in an image into a physical distance travelled by the particle.
- camera control software may be used to determine how many pixels a particle trace consists of.
- the physical length of the particle trajectory (e.g. the actual distance travelled) may be determined.
- the actual distance travelled may be determined in a similar manner to the methodology used by known applications that implement automatic size measurement: by measuring the number of pixels at a given focal length and, given the camera's lens system, calculating the object's physical size.
- An example of such an application is the "Smart Measure" app by Smart Tools®.
- the skilled person will be familiar with trigonometric principles that may be used to determine a distance travelled from a change in position in an image and a focal length.
- the distance travelled by the particle may be computed from the focal length 203 and angle θ, as illustrated in Figure 2a.
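Under a simple pinhole-camera model (a hedged sketch; the parameter names and numeric values are illustrative assumptions, not from the patent), a pixel displacement can be converted to a physical distance at the focal plane:

```python
# Pinhole-camera sketch: a displacement of p pixels on a sensor with pixel
# pitch s (m/px), lens focal length f (m), and focal plane at distance D (m)
# corresponds to a physical displacement of roughly d = p * s * D / f
# (small-angle approximation).

def physical_displacement(pixels, pixel_pitch_m, focal_length_m, plane_distance_m):
    return pixels * pixel_pitch_m * plane_distance_m / focal_length_m

# 100 px on a 1.4 um-pitch sensor, f = 4 mm, focal plane 2 m away: ~0.07 m.
d = physical_displacement(100, 1.4e-6, 4e-3, 2.0)
```
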
- Timing information such as the duration of illumination of the particle 206 by the illumination source 106 or the duration of the recording made by the imaging module 108 (e.g. such as the length of a video recording or a shutter speed used by the imaging module) may be used in conjunction with the determined distance travelled by the particle to determine the speed of the particle. In this way, the speed of an airborne particle may be used to determine wind speed.
- the user device may further (e.g. alternatively or additionally) comprise a module for determining the inclination of the user device 100, such as an accelerometer and/or gyroscope.
- the inclination may be determined relative to any defined surface or plane, for example, relative to a horizontal plane (e.g. the ground), a vertical plane, relative to the forward pointing direction or focal plane of the imaging module 108.
- the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to determine an angle of inclination of the user device and determine a wind velocity based on the determined wind speed and the determined angle of inclination of the user device. In this way, directional information is obtained and this allows a velocity to be determined (e.g. a vector quantity describing both the speed and direction of the airborne particle).
- the user device 100 may comprise a compass module. The compass module may be configured to determine the orientation of the user device (e.g. which direction the user device, and thus the imaging module of the user device, is facing relative to the geographic directions, north, south, east or west).
- the compass module 110 may determine the orientation of the user device 100 using any appropriate method; for example, the compass module may comprise a magnetic compass. Alternatively or additionally, the compass module may comprise a global positioning system (GPS) and/or an accelerometer, and may be operative to determine the orientation of the user device 100 based on a combination of acquired GPS data and accelerometer data.
- the user device 100 may therefore be operative to determine, using such a compass module 110, an orientation of the user device, and determine a wind velocity based on the determined wind speed and the determined orientation of the user device.
- Figure 2b illustrates the same user device 100 as shown in Figure 2a, further comprising a compass module 110.
- the compass module 110 determines the orientation of the user device 100 (e.g. the direction in which the user device 100 is facing relative to due north) and thus is able to determine a velocity (or velocity component) of the particle.
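As a hedged sketch of how a measured speed and a compass heading might be combined into a velocity (the function and parameter names are illustrative, not from the patent):

```python
import math

# Combine a measured speed with the device orientation, given as a compass
# heading in degrees clockwise from due north, into north/east velocity
# components.

def velocity_components(speed_mps, heading_deg):
    heading = math.radians(heading_deg)
    return speed_mps * math.cos(heading), speed_mps * math.sin(heading)

# 3 m/s measured while the device faces due east (090 degrees):
north, east = velocity_components(3.0, 90.0)  # north ~0, east ~3
```
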
- the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to determine wind speed based on a magnification setting of the imaging module used to record the image data.
- the speed of movement of the particle (and thus the wind speed), determined as described above, will reflect the "true" speed of the particle if the particle is moving in a direction perpendicular to the imaging direction of the imaging module 108 (e.g. if the particle is moving through the focal plane 204 in Figure 2a).
- otherwise, the determined speed of movement of the particle may comprise a projection of the "true" speed onto the focal plane 204 (e.g. the measured speed may reflect the component of the speed in the focal plane 204).
- the user device may need to be oriented such that the recorded particles move within the focal plane of the imaging module.
- the instructions when executed by the processor may further cause the user device 100 to record a first portion of image data of a first particle and record a second portion of image data of a second particle.
- the instructions when executed by the processor may then cause the user device 100 to determine a speed of movement of the first particle and a speed of movement of the second particle, and determine the wind speed based on the determined speeds of movement of the first and second particles.
- the first portion of image data may be recorded in a first direction and the second portion of image data may be recorded in a second direction (e.g. the imaging module/user device may be pointing in a first direction when the first portion of image data is recorded and a second direction when the second portion is recorded; a user may have rotated the user device 100 between the recordings of the first and second portions of image data).
- a velocity field may be mapped, describing the wind speed (e.g. the projection of the true wind speed onto the focal plane of the imaging module) in multiple directions.
- Figure 3 shows a top view of a user device facing in two different orientations, illustrated by the boxes 100a and 100b respectively, in a wind field indicated by the arrows 302.
- a first portion of image data comprising a first particle is recorded whilst the user device 100 is pointing towards the direction "a", as illustrated by device 100a, and a second portion of image data comprising a second particle is recorded whilst the user device 100 is pointing towards the direction "b", as illustrated by device 100b.
- the device may be rotated (e.g. by a user) through the illustrated positions 100a and 100b.
- the measured speeds in the directions "a" and "b" may thus be combined to determine a more accurate wind speed.
- the user device may record third or subsequent portions of image data (for example in the directions c and d shown in Figure 3) and determine the wind speed (or a wind vector field) based on the determined speeds of movement of particles in the first, second, third and subsequent portions of image data.
- a full wind vector comprising the wind speed and direction (e.g. with respect to North-East-South-West) may be determined.
- the first and/or second portions of image data may be portions of data recorded in the same direction (e.g. orientation). Determined speeds of movement of the first and second particles may then be combined to improve the accuracy of the determination of the wind speed. For example, generally, the determined speeds of movement of the first and second particles may be averaged to determine a more accurate measure of the wind speed.
- such an average may be improved, for example, by removing outlying speed determinations before determining an average speed.
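One way such outlier removal might be sketched (the rejection criterion, names and values are illustrative assumptions, not specified by the patent):

```python
import statistics

# Average several per-particle speed estimates after discarding readings
# more than k population standard deviations from the median.

def robust_mean_speed(speeds, k=2.0):
    med = statistics.median(speeds)
    sd = statistics.pstdev(speeds) or 1.0  # avoid zero width when all equal
    kept = [s for s in speeds if abs(s - med) <= k * sd]
    return statistics.mean(kept)

# Five consistent readings and one outlier (a passing insect, say):
speeds = [4.9, 5.1, 5.0, 4.8, 5.2, 25.0]
avg = robust_mean_speed(speeds)  # ~5.0 m/s after the outlier is dropped
```
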
- the average particle wind speed/direction vector fields of different particles in each portion of image data may be calculated (e.g. the processor 102 may calculate the average speed of a plurality of particles in the same image).
- an average particle wind speed/direction vector field over a plurality of portions of image data using the same focal length may be determined (e.g. the processor 102 may calculate the average speed of a plurality of particles in different images).
- an average particle wind speed/direction vector field may be determined over a plurality of portions of image data recorded at different focal lengths (e.g. the processor 102 may calculate the average speed of a plurality of particles recorded in different images, each image being recorded at a different focal length).
- the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to determine a speed of movement of the user device and determine a wind speed based on the calculated speed of movement of the particle and the determined speed of movement of the user device. For example, if the user device 100 is moving, then the wind speed (or velocity) may be determined by subtracting the speed (or velocity) of the user device from the determined speed of movement (or determined velocity) of the airborne particle. This may be useful, for example, in embodiments where the user device 100 comprises a UAV or comprises part of a moving vehicle (such as a console of a vehicle such as a car or lorry). Such user devices may comprise a plurality of imaging modules 108, pointing in different directions.
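The subtraction described above can be sketched as follows (illustrative names; velocities are given as north/east component pairs):

```python
# The wind velocity is the measured particle velocity minus the device's
# own velocity, both expressed in the same frame (here, north/east in m/s).

def wind_velocity(particle_v, device_v):
    return tuple(p - d for p, d in zip(particle_v, device_v))

# A particle appears to move at (12, 0) m/s while the device moves at
# (10, 0) m/s, giving a wind velocity of (2, 0) m/s:
w = wind_velocity((12.0, 0.0), (10.0, 0.0))
```
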
- first, second and subsequent portions of image data may be recorded simultaneously, each portion being recorded by a different one of the plurality of imaging modules.
- the wind speed/wind velocity vector may be determined as described above, without the need for rotation of the user device.
- the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to set a focal length of the imaging module based on the determined speed of movement of the user device.
- the user device may set a deeper focal length when the user device is travelling at a first speed and a shallower focal length when the user device is travelling at a second speed, the first speed being faster than the second speed.
- the focal length of the imaging module of the user device 100 may be set so as to record more distant particles when the user device is moving at higher speeds and closer particles when the user device is moving at lower speeds.
- the focal length may thus be tuned to image the closest (and thus most clearly resolvable) particles, whilst ensuring that particles in near-body regions are avoided.
- the focal length may be set, for example, according to the speed of the user device 100 according to a relationship such as:
- if the device speed is below thr_stationary, apply focal length sequence LOW; if it is between thr_stationary and thr_intermediate_speed, apply focal length sequence INTERM; if it exceeds thr_high_speed, apply focal length sequence HIGH; wherein thr_stationary, thr_intermediate_speed and thr_high_speed comprise threshold speeds indicating when the device is stationary, travelling at medium speed or travelling at high speed respectively. "Focal length sequence LOW" may comprise, for example, recording an image data sequence at different relatively shallow (e.g. near-field) focal lengths. "Focal length sequence INTERM" may comprise, for example, recording more image data (e.g. of more particles) compared to when the device is stationary.
- Determined speeds of such particles may be averaged to improve the accuracy of the determined wind speed, as described above.
- a deeper (e.g. medium-field) focal length may be used, compared to the focal length of the LOW sequence.
- "focal length sequence HIGH" may comprise recording the greatest amount of image data at the deepest (e.g. far-field) focal lengths, compared to the INTERM or LOW sequences. Speeds of movement determined for different particles in different portions of image data recorded according to the HIGH focal length sequence may further be averaged, combined or aggregated (according to the details provided above) to improve the accuracy of the determinations.
- image data comprising near-body regions (e.g. where a user device’s motion induces higher air-flow velocity) may be avoided, whilst ensuring that the closest possible particles to the user device are used to determine the wind speed. In this way it may be ensured that the determined wind speed reflects the true speed of the wind, rather than effects caused by the passage of the user device through the flow.
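The threshold-based selection described above might be sketched as follows; the threshold names come from the text, but the numeric values and return labels are illustrative assumptions (the two upper thresholds are collapsed into one cut point for simplicity):

```python
# Hypothetical threshold values for selecting a focal length sequence
# from the device's own speed of movement.
THR_STATIONARY = 0.5          # m/s: below this, treat the device as stationary
THR_INTERMEDIATE_SPEED = 5.0  # m/s: above this, treat the device as fast

def focal_length_sequence(device_speed_mps):
    if device_speed_mps < THR_STATIONARY:
        return "LOW"     # shallow, near-field focal lengths
    if device_speed_mps < THR_INTERMEDIATE_SPEED:
        return "INTERM"  # deeper, medium-field focal lengths
    return "HIGH"        # deepest, far-field focal lengths

# e.g. a drone cruising at 10 m/s would use the HIGH sequence:
seq = focal_length_sequence(10.0)
```
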
- the determined wind speed may be taken into account in on-board route planning (for example, by a sat-nav system).
- the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to amend a route plan for the vehicle based on the determined wind speed and the determined speed of movement of the vehicle. For example, based on the determined wind speed and/or direction, the user device 100 may alter course or change speed.
- the UAV may be operative to adapt a flight control loop (e.g. to minimize flight drifting) according to the determined wind speed.
- the UAV may further be additionally or alternatively operative to use the determined wind speed/direction relative to the UAV’s velocity vector (e.g. speed over ground) to continuously optimize route planning.
- the UAV may change direction or speed to reduce flight power consumption (e.g.
- the instructions when executed by the processor may additionally or alternatively cause the user device 100 to transmit (e.g. report) the determined wind speed (or velocity) to an external server or crowd-sourcing service.
- the user device may transmit wind speed (or velocity) data to the server or crowd-sourcing service each time the wind speed is measured (e.g. transmitted in real-time), or, for example, periodically.
- the instructions when executed by the processor may additionally or alternatively cause the user device 100 to determine the wind speed periodically, on request by a user of the user device, or on request by the server or crowd-sourcing service (e.g. the user device may receive an instruction from the server or crowd-sourcing device instructing the user device to determine the wind speed).
- Such an external server or crowd-sourcing service may be used to build up real-time weather maps based on local user data. It may further enable local wind statistics to be determined. Such measurements may further be aggregated; for example, wind speed measurements made by two or more nearby devices may be used to build up a wind speed map in that particular region. In some examples, image data recorded by different user devices in different directions may be merged (e.g. into a full 360 degree image sweep), thus enabling improved wind direction estimates to be made.
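A minimal sketch of how such a server or crowd-sourcing service might aggregate reports from nearby devices into a local wind speed estimate follows. The function, the report field names and the flat x/y coordinate convention are illustrative assumptions, not part of the disclosure:

```python
# Illustrative sketch: average wind speed reports from devices located
# within a given radius of a point of interest. Report fields ("x", "y" in
# metres, "speed" in m/s) are hypothetical names for illustration.
import math

def aggregate_nearby(measurements, centre, radius_m):
    """Return the mean reported wind speed within radius_m of centre, or None."""
    nearby = [m["speed"] for m in measurements
              if math.hypot(m["x"] - centre[0], m["y"] - centre[1]) <= radius_m]
    if not nearby:
        return None
    return sum(nearby) / len(nearby)
```

A real service would additionally weight reports by age and estimated accuracy, but the averaging step above captures the basic aggregation idea.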
- information relating to the determined wind speed and/or direction may further be stored in the metadata of images for further use and applications. For example, in addition to geo position and time of capture which are routinely stored in image metadata, wind speed and direction may also be stored.
- the instructions when executed by the processor may additionally or alternatively cause the user device 100 to determine the number and/or physical properties (e.g. such as size, shape and color/reflectance) of particles in a field of view of the imaging module 108.
- the particles comprise e.g. snow, ice or water droplets
- such information may be used, for example, to estimate the amount of precipitation.
- the method 400 comprises illuminating an airborne particle using an illumination source of the user device.
- the method comprises recording, using an imaging module of the user device, image data of the particle whilst the particle is illuminated by the illumination source.
- the method 400 comprises determining a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
- the method 400 may be performed by a user device such as the user device 100.
- the method 400 may be performed by a server such as an external server associated with a crowd-sourcing service.
- an external server may instruct a user device or user devices to perform the method 400.
- the timing information may comprise one or more of: i) a shutter speed of the imaging module, ii) a duration of time during which the particle is recorded by the imaging module, and iii) a duration of time during which the particle is illuminated by the illumination source. Details relating to the form and use of the timing information were described in detail above with respect to user device 100 and the details therein will be understood to apply equally to the method 400.
- determining 404 a wind speed further comprises determining a wind speed based on a focal length of the image data. Details relating to the form and use of the focal length of the image data were described above with respect to device 100 and the details therein will be understood to apply equally to block 404.
- the method 400 may further comprise determining an angle of inclination of the user device, and determining a wind velocity based on the determined wind speed and the determined angle of inclination of the user device.
- the method 400 may further comprise determining an orientation of the user device, and determining a wind velocity based on the determined wind speed and the determined orientation of the user device.
- the method 400 further comprises recording a first portion of image data of a first particle when the user device is pointing in a first direction, recording a second portion of image data of a second particle when the user device is pointing in a second direction, determining a speed of movement of the first particle and a speed of movement of the second particle, and determining the wind speed based on the determined speeds of movement of the first and second particles.
- the method 400 may comprise rotating the device between the first and second directions. Determining a wind speed based on first and second portions of image data comprising first and second particles was described above with respect to user device 100 and the details therein will be understood to apply equally to the method 400.
- the method 400 further comprises determining a speed of movement of the user device, and determining a wind speed based on the calculated speed of movement of the particle and the determined speed of movement of the user device.
- the method 400 may comprise setting a focal length of the imaging module of the user device based on the determined speed of movement of the user device. For example, setting a focal length may comprise setting a deeper focal length when the user device is travelling at a first speed and a shallower focal length when the user device is travelling at a second speed, the first speed being faster than the second speed.
- the method 400 may further comprise amending a route plan for the vehicle based on the determined wind speed and the determined speed of movement of the vehicle.
- the method 400 may further comprise the user device transmitting the determined wind speed to an external server or crowd-sourcing service. Transmitting the wind speed to an external server or crowd-sourcing service in this way was described above with respect to the user device 100 and the details therein will be understood to apply equally to method 400.
- the method 400 may further comprise repeating the blocks of illuminating an airborne particle, recording image data and determining a wind speed, using one or more other user devices, and determining a wind speed based on the wind speed determined using the user device and the wind speeds determined using the one or more other user devices.
- each of the other devices may comprise a user device 100.
- the user device and the other user devices may be co-ordinated by an external server such as a server associated with crowd-sourcing as described above.
- the other user devices may record image data (comprising a particle) in different orientations compared to the user device and wind speeds measured from the recorded image data of the other devices may be aggregated to determine a more accurate wind speed or to determine directional information that may be used to determine a velocity or velocity field of the wind.
- the method 400 may further comprise acquiring location information relating to each of the one or more other devices, and determining a velocity field describing wind speed and wind direction based on the acquired location information, the wind speed determined using the user device and the wind speeds determined using the one or more other devices.
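As an illustrative sketch only, per-device reports comprising a location, a determined wind speed and an observed wind heading might be combined into a simple velocity field as follows. The report field names and the convention that the heading equals the wind direction observed by that device are assumptions for illustration:

```python
# Illustrative sketch: build a velocity field from per-device reports.
# Each report is assumed (hypothetically) to carry a location ("x", "y"),
# a determined wind "speed" (m/s) and a wind "heading" in radians.
import math

def velocity_field(reports):
    """Return a list of (location, (vx, vy)) pairs, one per report."""
    return [((r["x"], r["y"]),
             (r["speed"] * math.cos(r["heading"]),
              r["speed"] * math.sin(r["heading"])))
            for r in reports]
```

Interpolating between such sampled vectors would then yield the continuous velocity field describing wind speed and direction over the region.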
- there is a computer program comprising instructions which, when executed on at least one processor, cause the at least one processor to carry out the method 400.
- there is a computer program product comprising a computer-readable medium with the aforementioned computer program.
Abstract
A user device for determining wind speed comprises a processor, a memory, an illumination source and an imaging module for capturing image data. The memory contains instructions executable by the processor to cause the user device to illuminate an airborne particle using the illumination source and record, using the imaging module, image data of the particle whilst the particle is illuminated by the illumination source. The user device is further caused to determine a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
Description
METHODS AND USER DEVICES FOR DETERMINING WIND SPEED
Technical Field
The disclosure relates to a user device for determining wind speed, a method of determining wind speed using a user device, a corresponding computer program, and a corresponding computer program product.
Background
The large number of user devices (e.g. such as phones and tablets) in use today provide unparalleled opportunities to collect and share data from a wide variety of locations. One field in which collection of local data may be useful is in weather reporting and forecasting. Data collection from user devices may be used to improve weather reporting and to provide meteorological information that might be used to improve weather forecasting (e.g. by providing improved local boundary conditions for weather models).
Currently, user devices may be adapted to measure wind speed with additional dedicated hardware. For example, anemometers (e.g. cup anemometers or vane anemometers) may be added to mobile phones or other user devices to enable the user device to measure wind speed.
Such solutions require dedicated hardware that has to be added to the user device, which does not encourage spontaneous or automated collection of wind data and may be expensive. Furthermore, the additional hardware required means that very limited numbers of user devices are currently equipped to measure wind speed and thus crowd-based wind direction/speed information gathering (e.g. as input to any user-contributed weather service) is limited to a small user base.
It would therefore be advantageous to develop more user-friendly solutions to measure wind speed using user devices.
Summary of Invention
It is an object of the invention to provide an improved alternative to the above techniques and prior art. More specifically, it is an object of the invention to provide improved solutions for measuring wind speed using user devices.
Therefore, according to a first aspect there is provided a user device for determining wind speed. The user device comprises a processor, a memory, an illumination source and an imaging module for capturing image data. The memory contains instructions executable by the processor to cause the user device to illuminate
an airborne particle using the illumination source, record, using the imaging module, image data of the particle whilst the particle is illuminated by the illumination source and determine a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
The inventors herein have realised that imaging modules such as cameras and video equipment found in user devices, such as mobile phones, smartphones, smartwatches, tablet computers, laptop computers or games consoles, may be used in conjunction with timing information to image an airborne particle and compute the speed of movement of the airborne particle. In this way, wind speed may be inferred using a user device without the use of specialised hardware modules.
According to a second aspect there is provided a method of determining wind speed using a user device. The method comprises illuminating an airborne particle using an illumination source of the user device and recording, using an imaging module of the user device, image data of the particle whilst the particle is illuminated by the illumination source. The method further comprises determining a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
According to a third aspect there is provided a computer program comprising instructions which, when executed on at least one processor, cause the at least one processor to carry out the method of the second aspect.
According to a fourth aspect there is provided a computer program product comprising a computer-readable medium with the computer program of the third aspect.
In this way, the solutions herein provide low-cost, easily deployable ways to measure wind speed using a user device. The solutions herein therefore facilitate crowd-based measurements of wind-speed that can be used to improve meteorological data collection.
Brief description of the drawings
For a better understanding and to show more clearly how embodiments herein may be carried into effect, reference will now be made, by way of example only, to the accompanying drawings, in which:
Figure 1 shows an example user device for determining wind speed according to some embodiments herein;
Figures 2a and 2b illustrate how a user device may be used to determine wind speed according to some embodiments herein;
Figure 3 shows a schematic illustrating how a user device may be used to determine wind speed according to some embodiments herein; and
Figure 4 shows a method for determining wind speed according to some embodiments herein.
Detailed Description
As described above, many user devices are currently unable to measure wind speed without the installation of additional dedicated hardware.
Figure 1 shows an example user device 100 for determining wind speed according to embodiments herein. The user device 100 comprises a processor 102, a memory 104, an illumination source 106 and an imaging module 108 for capturing image data. The memory contains instructions executable by the processor to cause the user device to illuminate an airborne particle using the illumination source 106 and record, using the imaging module 108, image data of the particle whilst the particle is illuminated by the illumination source 106. The instructions when executed by the processor further cause the user device 100 to determine a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
More generally, the user device is operative, configured or adapted to perform the steps described above.
The user device 100 may comprise any type of user device, such as a consumer electronics device, or portable electronics device. In some embodiments, the user device 100 may comprise a mobile phone, smartphone, smartwatch, tablet computer, laptop computer or games console. In some embodiments, the user device 100 may comprise an unmanned aerial vehicle (UAV) such as a drone. In some embodiments, the user device 100 may comprise a vehicular console such as an electronic console in a car or lorry. More generally the user device 100 may comprise any electronic user equipment comprising an imaging module 108 and an illumination source 106.
The memory 104 may comprise instruction data representing instructions. The processor 102 may be configured to communicate with the memory 104 and to execute the instructions. The instructions when executed by the processor may cause the processor to send instructions to the illumination source 106 and/or the imaging module 108. The instructions when executed by the processor may also cause the processor to determine, calculate, or process in any other way, data related to the
methods and processes described herein. In this way, the user device 100 may perform any of the embodiments of the methods described below, such as the method 400. The memory 104 may be configured to store the instruction data in the form of program code that can be executed by the processor 102.
In some implementations, the instruction data can comprise a plurality of software and/or hardware modules that are each configured to perform, or are for performing, individual or multiple steps of the method described herein. In some embodiments all modules described herein are implemented as hardware and in others all of them are implemented through software. In some embodiments, the memory 104 may be part of a device that also comprises one or more other components of the user device 100 (for example, the processor 102 and/or one or more other components of the user device 100). In alternative embodiments, the memory 104 may be part of a separate device to the other components of the user device 100.
In some embodiments, the memory 104 may comprise a plurality of sub-memories, each sub-memory being capable of storing a piece of instruction data. In some embodiments where the memory 104 comprises a plurality of sub-memories, instruction data representing the instructions may be stored at a single sub-memory. In other embodiments where the memory 104 comprises a plurality of sub-memories, instruction data representing the instructions may be stored at multiple sub-memories. Thus, according to some embodiments, the instruction data representing different instructions may be stored at one or more different locations in the user device 100. In some embodiments, the memory 104 may be used to store information, such as image data or other data relevant to determinations made by the processor 102 of the user device 100 or from any other components of the user device 100.
The processor 102 can comprise one or more processors, processing units, multi-core processors and/or modules that are configured or programmed to control the user device 100 in the manner described herein. In some implementations, for example, the processor 102 may comprise a plurality of (for example, interoperated) processors, processing units, multi-core processors and/or modules configured for distributed processing. It will be appreciated by a person skilled in the art that such processors, processing units, multi-core processors and/or modules may be located in different locations and may perform different steps and/or different parts of a single step of the method described herein.
The illumination source 106 may comprise any source of electromagnetic radiation. For example, the illumination source may comprise a light-emitting diode (LED), light bulb, laser, infrared (IR) diode or any other light source. The illumination
source 106 may emit optical electromagnetic radiation, infrared electromagnetic radiation, or electromagnetic radiation at any other frequency or in any other frequency band that can be used to illuminate an airborne particle.
The illumination source 106 may provide periodic illumination or illumination for a fixed duration of time. For example, in embodiments where the illumination source 106 comprises an LED, the LED may comprise a pulse-width-modulation LED. Pulse-width-modulation LEDs operate at high frequency (e.g. repeatedly turn on and off), the frequency being high enough so that the human eye cannot resolve the inherent flicker.
Generally, the illumination source may provide periodic illumination (e.g. turn on and off) at a frequency, v, the periodic illumination comprising pulses having a pulse duration dt_illum.
The imaging module 108 may comprise a camera, video equipment or any other equipment suitable for recording (e.g. taking or acquiring) image data. In this sense, image data may comprise, for example, visual images, sequences of visual images, video recordings and/or live images (e.g. picture-movie mixtures, holding 10-100 ms of frames that can be used as a short animation) of an airborne particle. Generally, the imaging module 108 is suitable for recording images of particles at the wavelength of electromagnetic radiation emitted by the illumination source 106.
The imaging module may record (or take) image data for a predefined period of time. For example, the imaging module may have a shutter speed, and may record image data of the airborne particle during a duration of time dt_record associated with the shutter speed.
The skilled person will appreciate that the user device 100 may further comprise additional components to those listed here. For example, the user device 100 may comprise a communications interface, for example, for sending or receiving information over a wireless or wired connection, e.g., a cellular communications network, for example, such as a 2G, 3G, 4G or 5G cellular network, or a wireless local area network (WLAN)/Wi-Fi network. The user device may further comprise one or more user interfaces for receiving user input and/or displaying information to a user. For example, the user device 100 may further comprise a keyboard, mouse, microphone, display screen, touchscreen, and/or speakers. Examples of other components that may be comprised in the user device 100 include a power source such as a battery or mains power connection.
As briefly noted above, the instructions when executed by the processor cause the user device 100 to illuminate an airborne particle using the illumination source 106 (e.g. the instructions when executed by the processor may cause the processor to
send an instruction to the illumination source to cause the illumination source to begin illuminating).
As used herein, an airborne particle may comprise any airborne particle small enough such that the speed of the airborne particle as it moves through the air is approximately equal to the wind speed (e.g. the airborne particle should generally follow the fluid flow dynamics of the air in which it is moving). Examples of airborne particles that might be imaged by the user device 100 include mist, water
drops/droplets (e.g. rain), sand particles, snowflakes, ice crystals (e.g. hail), pollen, leaves, pine needles, or small insects such as gnats.
The user device 100 may therefore be operative to illuminate (e.g. shine a light on) such an airborne particle, as described above.
The instructions when executed by the processor further cause the user device 100 to record, using the imaging module 108, image data of the particle whilst the particle is illuminated by the illumination source 106 (e.g. the instructions may cause the processor to send an instruction to the imaging module instructing the imaging module to begin recording). The instructions when executed by the processor further cause the user device 100 to determine a wind speed by calculating a speed of movement of the particle based on a change of position (e.g. the trajectory) of the particle in the image data and timing information related to the capture of the image data.
Generally, the timing information may relate to a duration of time during which the change of position of the particle in the image data occurred. For example, the timing information may comprise timing information relating to a duration of time in which the particle was recorded and/or a duration of time during which the particle was illuminated (such as the durations dt_illum or dt_record above). The timing information may comprise timing information associated with a (e.g. mechanical) function of the user device, for example, a shutter speed or exposure time of the imaging module as described below.
The distance travelled by the particle (e.g. the physical or actual distance) may be determined from the change of position of the particle in the image (e.g. the particle trajectory in the image) as will be described below. The speed of the particle may be calculated, for example, from the relationship:
speed = (distance travelled as indicated by the change of position of the particle in the image data)/(duration of time indicated or derived from the timing information).
Generally, therefore, a speed may be calculated based on a change of position of the particle which is captured in the image data and a period of time during which the change of position is captured.
The skilled person will appreciate that speed is a scalar quantity that describes how quickly a particle is moving. Velocity is a vector quantity that describes both how fast a particle is moving and the direction of travel. The directional aspect of velocity may be captured by the direction of the velocity vector in the vector space. The speed may be calculated from the velocity vector as the length or norm (e.g. magnitude) of the velocity vector.
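The relationship between velocity and speed noted above may be sketched minimally as follows:

```python
# Speed is the Euclidean norm (magnitude) of the velocity vector, as noted
# above. The sketch works for velocity vectors of any dimension.
import math

def speed_from_velocity(velocity):
    """Return the scalar speed |v| of a velocity vector v."""
    return math.sqrt(sum(component * component for component in velocity))
```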
In more detail, the timing information may comprise, for example, one or more of: i) a shutter speed or exposure time of the imaging module, ii) a duration of time during which the particle is recorded by the imaging module, and iii) a duration of time during which the particle is illuminated by the illumination source.
As an example, in some embodiments, the user device may comprise an illumination source in the form of an LED (e.g. such as the LED on a mobile phone). In such embodiments, the LED light may be switched on and off with a frequency, v, and consequently illuminate the particle during a duration of time t_illum corresponding to 1/v. In such embodiments, the imaging module on the user device may record image data during the interval t_illum. The processor may determine a distance travelled by the particle based on a change in position of the particle in the image data (as will be described below) and calculate the speed of movement of the particle according to speed = distance travelled / t_illum.
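The LED-based timing calculation above can be sketched as follows, assuming the illumination interval is taken as t_illum = 1/v; the function name is illustrative:

```python
# Illustrative sketch: compute particle speed from the physical distance
# travelled (derived from the change of position in the image data) and the
# LED illumination interval t_illum = 1/v.

def particle_speed(distance_travelled_m, led_frequency_hz):
    """Return speed (m/s) as distance travelled divided by t_illum = 1/v."""
    t_illum = 1.0 / led_frequency_hz
    return distance_travelled_m / t_illum
```

For example, a particle displaced 5 cm during one illumination interval of a 100 Hz source is moving at roughly 5 m/s.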
In some embodiments, the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to determine the wind speed based on a focal length of the imaging module used for recording the image data (e.g. the distance from the user device at which objects (such as particles) appear in focus). For example, the actual distance travelled by the particle, as indicated by the change of position of the particle in the image data, may be determined using the focal length of the image data.
In some embodiments, the focal length of the imaging module 108 may be set to a fixed focal length. This may enable particles at a specific distance from the camera lens (i.e. particles at the fixed focal length) to be considered. For example, a fixed focal length may be selected in a range where the optical system of the imaging module provides optimal distance resolution. In the far-field limit, where the focal length approaches infinity, the distance resolution may become relatively poor; hence, in some embodiments, a focal length may be selected in a range where the imaging module 108 has the best depth granularity.
Alternatively or additionally, a focal length may be selected (e.g. automatically) by the imaging module 108 of the user device 100. This may enable an "optimal" focal length to be selected by the imaging module 108, based on ambient conditions. It will be understood that this approach may comprise a standard camera operation procedure, e.g. whereby a button is pressed, the camera estimates the optimal focal length and captures the image. Generally, auto-focus of a user device may be achieved using processes such as Contrast Detection, Laser Autofocus, or Phase Detection.
The focal length of an imaging module, e.g. the distance from the user device at which objects (such as particles) appear in focus, may generally be determined, for example, using known commands on a user device. As an example, the
getFocusDistances(float[] output) call available on some Android user devices may be used, which gets the distances from the camera to where an object appears to be in focus.
Figure 2a shows a schematic illustrating how an example user device may be used to calculate wind speed according to some embodiments herein and illustrates some of the principles described above. Figure 2a shows a user device 100 comprising an imaging module 108 and an illumination source 106. In use, the imaging
module 108 captures image data within a field of view indicated by the arrow 202. The imaging module has a focal length 203 and a corresponding focal plane 204. The focal plane is defined as the plane in which an imaged particle (or any object) may be recorded by the imaging module in focus (e.g. a plane defined by the fact that all points in the plane lie at a distance from the imaging module corresponding to the focal length). When the illumination source 106 illuminates the field of view 202, the imaging module 108 records image data of a particle 206 passing through the focal plane 204 (defined by the focal length 203) in a direction 208. The skilled person will appreciate that many airborne particles may pass through the field of view 202 of the imaging module 108, however only those particle(s) 206 passing through the focal plane 204 will be in focus and therefore visible in the image data recorded by the imaging module 108. As such, it may be assumed that an illuminated particle captured by the imaging module resides in the focal plane of the image data.
A processor (not illustrated in Figure 2a) in the user device 100 may process the recorded image data and determine a change of position of the particle in the image data. The skilled person will be familiar with image processing tools that may be used to determine the location of a particle in an image. For example, open source libraries
such as the Open Source Computer Vision Library (OpenCV) software development kit, suitable for use in Android, comprise commands such as SimpleBlobDetector() which may be used to detect 'blobs' or groups of connected pixels such as the particles herein.
The skilled person will be further familiar with methods to convert a distance travelled by a particle in an image into a physical distance travelled by the particle. For example, camera control software may be used to determine how many pixels a particle trace consists of. Combined with the focal length, the physical length of the particle trajectory (e.g. the actual distance travelled) may be determined. (Note: the actual distance travelled may be determined in a similar manner to the methodology used by known applications that implement automatic size measurement, by measuring the number of pixels at a given focal length and, given the camera's lens system, calculating the object's physical size. An example of such an application is the "Smart Measure" app by Smart Tools®.)
The skilled person will be familiar with trigonometric principles that may be used to determine a distance travelled from a change in position in an image and a focal length. As an example, the distance travelled by the particle may be computed from the focal length 203 and angle θ, as illustrated in Figure 2a.
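One possible form of this trigonometric step is sketched below: the particle trace's pixel length is first converted to the angle it subtends at the lens, and that angle together with the focal length gives the physical trace length in the focal plane. The linear pixel-to-angle mapping and the assumption that the trace is centred on the optical axis are simplifications for illustration:

```python
# Illustrative sketch of the geometry in Figure 2a. Both helpers and their
# simplifying assumptions (linear pixel-to-angle mapping, trace centred on
# the optical axis) are hypothetical, not specified by the disclosure.
import math

def subtended_angle_rad(trace_pixels, image_width_pixels, horizontal_fov_rad):
    """Approximate the angle subtended by a trace of trace_pixels pixels."""
    return (trace_pixels / image_width_pixels) * horizontal_fov_rad

def trace_length_m(focal_length_m, subtended_angle):
    """Physical length of a trace in the focal plane subtending the angle."""
    return 2.0 * focal_length_m * math.tan(subtended_angle / 2.0)
```

Dividing the resulting physical length by the timing information (e.g. dt_illum) then yields the particle speed.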
Timing information, such as the duration of illumination of the particle 206 by the illumination source 106 or the duration of the recording made by the imaging module 108 (e.g. such as the length of a video recording or a shutter speed used by the imaging module) may be used in conjunction with the determined distance travelled by the particle to determine the speed of the particle. In this way, the speed of an airborne particle may be used to determine wind speed.
Turning back to Figure 1, in some embodiments, the user device may further (e.g. alternatively or additionally) comprise a module for determining the inclination of the user device 100, such as an accelerometer and/or gyroscope. The inclination may be determined relative to any defined surface or plane, for example, relative to a horizontal plane (e.g. the ground), a vertical plane, or the forward pointing direction or focal plane of the imaging module 108.
The instructions when executed by the processor may additionally or alternatively further cause the user device 100 to determine an angle of inclination of the user device and determine a wind velocity based on the determined wind speed and the determined angle of inclination of the user device. In this way, directional information is obtained and this allows a velocity to be determined (e.g. a vector quantity describing both the speed and direction of the airborne particle).
In some embodiments, additionally or alternatively to the options above, the user device 100 may comprise a compass module. The compass module may be configured to determine the orientation of the user device (e.g. which direction the user device, and thus the imaging module of the user device, is facing relative to the geographic directions, north, south, east or west). The compass module 110 may determine the orientation of the user device 100 using any appropriate method, for example, the compass module may comprise a magnetic compass. Alternatively or additionally, the compass module may comprise a global positioning system (GPS) and/or an accelerometer and the compass module may be operative to determine the orientation of the user device 100 based on a combination of acquired GPS data and accelerometer data.
The user device 100 may therefore be operative to determine, using such a compass module 110, an orientation of the user device, and determine a wind velocity based on the determined wind speed and the determined orientation of the user device.
This is illustrated in Figure 2b, which shows the same user device 100 as Figure 2a, further comprising a compass module 110. The compass module 110 determines the orientation of the user device 100 (e.g. the direction in which the user device 100 is facing relative to due north) and thus is able to determine a velocity (or velocity component) of the particle.
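Combining a measured speed with a compass heading to obtain a velocity can be sketched as a simple resolution into north/east components (an illustrative sketch; the heading convention, degrees clockwise from due north, is an assumption):

```python
import math

def wind_velocity(speed_ms, heading_deg):
    """Resolve a wind speed measured in the device's focal plane into
    (north, east) components, given the compass heading (degrees
    clockwise from due north) along which the particle moved."""
    rad = math.radians(heading_deg)
    north = speed_ms * math.cos(rad)
    east = speed_ms * math.sin(rad)
    return north, east

n, e = wind_velocity(10.0, 90.0)  # wind blowing due east
print(round(n, 6), round(e, 6))   # 0.0 10.0
```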
The skilled person will appreciate that the details described above are merely examples and that other factors may be used in addition or alternatively to those listed above, for example, the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to determine wind speed based on a magnification setting of the imaging module used to record the image data.
Generally, the speed of movement of the particle (and thus the wind speed), determined as described above, will reflect the "true" speed of the particle if the particle is moving in a direction perpendicular to the imaging direction of the imaging module 108 (e.g. if the particle is moving through the focal plane 204 in Figure 2a). In the case that the particle is moving at an angle to the focal plane 204, the determined speed of movement of the particle may comprise a projection of the "true" speed onto the focal plane 204 (e.g. the measured speed may reflect the component of the speed in the focal plane 204). In order to determine the "true" speed, ideally, the user device may need to be oriented such that the recorded particles move within the focal plane of the imaging module.
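This projection effect can be expressed directly: if the particle's motion makes an angle with the focal plane, only the in-plane component is observed (a short geometric sketch):

```python
import math

def measured_speed(true_speed, angle_to_focal_plane_deg):
    """Speed observed in the image: the projection of the true particle
    speed onto the focal plane of the imaging module."""
    return true_speed * math.cos(math.radians(angle_to_focal_plane_deg))

print(measured_speed(10.0, 0.0))             # 10.0  (motion within the focal plane)
print(round(measured_speed(10.0, 60.0), 6))  # 5.0   (only half the true speed is seen)
```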
Therefore, in some embodiments, alternatively or additionally to the options described above, the instructions when executed by the processor may further cause the user device 100 to record a first portion of image data of a first particle and record a second portion of image data of a second particle. The instructions when executed by the processor may then cause the user device 100 to determine a speed of movement of the first particle and a speed of movement of the second particle, and determine the wind speed based on the determined speeds of movement of the first and second particles.
For example, the first portion of image data may be recorded in a first direction and the second portion of image data may be recorded in a second direction (e.g. the imaging module/user device may be pointing in a first direction when the first portion of image data is recorded and a second direction when the second portion of image data is recorded. For example, a user may have rotated the user device 100 between the recordings of the first portion of image data and the second portion of image data). In this way, (for example, in conjunction with orientation information as described above with respect to the compass module) a velocity field may be mapped, describing the wind speed (e.g. the projection of the true wind speed onto the focal plane of the imaging module) in multiple directions.
This is illustrated in Figure 3 which shows a top view of a user device facing in two different orientations, illustrated by the boxes 100a and 100b respectively, in a wind field indicated by the arrows 302. In this embodiment, a first portion of image data comprising a first particle is recorded whilst the user device 100 is pointing towards the direction“a”, as illustrated by device 100a, and a second portion of image data comprising a second particle is recorded whilst the user device 100 is pointing towards the direction“b”, as illustrated by device 100b. The device may be rotated (e.g. by a user) through the illustrated positions 100a and 100b. The measured speeds in the directions“a” and“b” may thus be combined to determine a more accurate wind speed.
The skilled person will appreciate that more than two portions of image data may be recorded, for example, the user device may record third or subsequent portions of image data (for example in the directions c and d shown in Figure 3) and determine the wind speed (or a wind vector field) based on the determined speeds of movement of particles in the first, second, third and subsequent portions of image data.
In some embodiments, the largest determined wind speed may be assumed to be representative of the actual wind speed, e.g. it may be assumed that the largest determined wind speed was measured in a direction whereby the focal plane of the imaging module coincides with (e.g. is parallel or approximately parallel with) the direction of the wind.
In examples where the user device is turned through a full turn (360 degrees) and where a plurality of image data is recorded, each piece of image data comprising a different particle, the particle direction can be deduced uniquely. In combination with a compass module on the user device (or similar input from an external device or server), a full wind vector comprising the wind speed and direction (e.g. with respect to North-East-South-West) may be determined.
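The selection logic for such a sweep can be sketched as follows: the heading at which the projected speed peaks is taken as the wind direction, and the speed there as the true wind speed (an illustrative reduction; in practice interpolation between headings could refine the estimate):

```python
def wind_from_sweep(measurements):
    """measurements: list of (compass_heading_deg, measured_speed_ms)
    pairs taken as the device is rotated through a sweep.
    Returns (heading, speed) at the maximum, taken as the wind
    direction and the (unprojected) wind speed."""
    heading, speed = max(measurements, key=lambda m: m[1])
    return heading, speed

sweep = [(0, 3.1), (90, 7.9), (180, 3.0), (270, 1.2)]
print(wind_from_sweep(sweep))  # (90, 7.9)
```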
In some embodiments, alternatively or additionally, the first and/or second portions of image data may be portions of data recorded in the same direction (e.g. orientation). Determined speeds of movement of the first and second particles may then be combined to improve the accuracy of the determination of the wind speed. For example, generally, the determined speeds of movement of the first and second particles may be averaged to determine a more accurate measure of the wind speed.
In some examples, such an average may be improved, for example, by removing outlying speed determinations before determining an average speed.
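One simple realisation of this (an illustrative choice of filter; the cut-off k=1.5 is an arbitrary placeholder, and other outlier criteria are equally possible) discards speeds far from the mean before averaging:

```python
def robust_mean_speed(speeds, k=1.5):
    """Average particle speeds after discarding outliers more than
    k standard deviations from the mean."""
    n = len(speeds)
    mean = sum(speeds) / n
    std = (sum((s - mean) ** 2 for s in speeds) / n) ** 0.5
    # Keep only speeds within k standard deviations; fall back to the
    # full list if the filter would discard everything.
    kept = [s for s in speeds if abs(s - mean) <= k * std] or speeds
    return sum(kept) / len(kept)

# Four consistent measurements and one spurious streak:
print(round(robust_mean_speed([5.0, 5.2, 4.9, 5.1, 25.0]), 2))  # 5.05
```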
For example, the average particle wind speed/direction vector fields of different particles in each portion of image data may be calculated (e.g. the processor 102 may calculate the average speed of a plurality of particles in the same image).
As another example, an average particle wind speed/direction vector field over a plurality of portions of image data using the same focal length may be determined (e.g. the processor 102 may calculate the average speed of a plurality of particles in different images).
As another example, an average particle wind speed/direction vector field may be determined over a plurality of portions of image data recorded at different focal lengths (e.g. the processor 102 may calculate the average speed of a plurality of particles recorded in different images, each image being recorded at a different focal length).
Turning now to other embodiments, in some embodiments the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to determine a speed of movement of the user device and determine a wind speed based on the calculated speed of movement of the particle and the determined speed of movement of the user device. For example, if the user device 100 is moving, then the wind speed (or velocity) may be determined by subtracting the speed (or velocity) of the user device from the determined speed of movement (or determined velocity) of the airborne particle.
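The subtraction described above can be sketched component-wise (velocities here are illustrative 2-D tuples; the same applies in three dimensions):

```python
def ground_wind(particle_velocity, device_velocity):
    """Wind velocity over ground: the observed particle velocity minus
    the velocity of the (moving) user device, component-wise."""
    return tuple(p - d for p, d in zip(particle_velocity, device_velocity))

# A particle appears to move at (12, 3) m/s while the device moves at (10, 0) m/s:
print(ground_wind((12.0, 3.0), (10.0, 0.0)))  # (2.0, 3.0)
```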
This may be useful, for example, in embodiments where the user device 100 comprises a UAV or comprises part of a moving vehicle (such as a console of a vehicle such as a car or lorry). Such user devices may comprise a plurality of imaging modules 108, pointing in different directions. As such, first, second and subsequent portions of image data (as described above) may be recorded simultaneously, each portion being recorded by a different one of the plurality of imaging modules. In this way, the wind speed/wind velocity vector may be determined as described above, without the need for rotation of the user device.
As will be familiar to the skilled person, moving objects such as moving vehicles may distort the surrounding air as they pass through. Therefore, in some embodiments, the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to set a focal length of the imaging module based on the determined speed of movement of the user device.
For example, the user device may set a deeper focal length when the user device is travelling at a first speed and a shallower focal length when the user device is travelling at a second speed, the first speed being faster than the second speed. In other words, the focal length of the imaging module of the user device 100 may be set so as to record more distant particles when the user device is moving at higher speeds and closer particles when the user device is moving at lower speeds. The focal length may thus be tuned to image the closest (and thus most clearly resolvable) particles, whilst ensuring that particles in near-body regions are avoided.
As another example, more generally the focal length may be characterised as a function of the speed of the user device 100 (e.g. focal length = f(speed of user device)). The focal length may be set, for example, according to the speed of the user device 100 according to a relationship such as:

i. (User device speed < thr_stationary)

Apply focal length sequence LOW

ii. (User device speed > thr_stationary) OR (Vehicle speed > thr_intermediate_speed)

Apply focal length sequence INTERM

iii. (User device speed > thr_high_speed)

Apply focal length sequence HIGH

wherein thr_stationary, thr_intermediate_speed and thr_high_speed comprise threshold speeds indicating when the device is stationary, travelling at medium speed or travelling at high speed, respectively. "Focal length sequence LOW" may comprise, for example, recording an image data sequence at different relatively shallow (e.g. near-field) focal lengths. "Focal length sequence INTERM" may comprise, for example, recording more image data (e.g. of more particles) compared to when the device is stationary.
Determined speeds of such particles may be averaged to improve the accuracy of the determined wind speed, as described above. Furthermore, a deeper (e.g. medium-field) focal length may be used, compared to the focal length of the LOW sequence. "Focal length sequence HIGH" may comprise recording the greatest amount of image data at the deepest (e.g. far-field) focal lengths, compared to the INTERM or LOW sequences. Speeds of movement determined for different particles in different portions of image data recorded according to the HIGH focal length sequence may further be averaged, combined or aggregated (according to the details provided above) to improve the accuracy of the determinations.
In this way, image data comprising near-body regions (e.g. where a user device’s motion induces higher air-flow velocity) may be avoided, whilst ensuring that the closest possible particles to the user device are used to determine the wind speed. In this way it may be ensured that the determined wind speed reflects the true speed of the wind, rather than effects caused by the passage of the user device through the flow.
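The threshold logic above can be sketched as a simple selection function (the thr_* values here are arbitrary placeholders, not values given in the description):

```python
THR_STATIONARY = 0.5   # m/s; below this the device is treated as stationary
THR_HIGH_SPEED = 20.0  # m/s; above this the HIGH sequence is used

def focal_length_sequence(device_speed_ms):
    """Select a focal-length sequence from the device speed, following
    the stationary / intermediate / high-speed rules described above."""
    if device_speed_ms < THR_STATIONARY:
        return "LOW"     # near-field focal lengths
    if device_speed_ms > THR_HIGH_SPEED:
        return "HIGH"    # far-field focal lengths, most image data
    return "INTERM"      # medium-field focal lengths

print(focal_length_sequence(0.0))   # LOW
print(focal_length_sequence(5.0))   # INTERM
print(focal_length_sequence(30.0))  # HIGH
```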
In embodiments, as described above, where the user device is moving, for example, because the user device comprises a UAV or other vehicle, the determined wind speed may be taken into account in on-board route planning (for example, by a sat-nav system). In some embodiments therefore, the instructions when executed by the processor may additionally or alternatively further cause the user device 100 to amend a route plan for the vehicle based on the determined wind speed and the determined speed of movement of the vehicle. For example, based on the determined wind speed and/or direction, the user device 100 may alter course or change speed.
In embodiments where the user device 100 comprises a UAV, the UAV may be operative to adapt a flight control loop (e.g. to minimize flight drifting) according to the determined wind speed. Such a UAV may further be additionally or alternatively operative to use the determined wind speed/direction relative to the UAV's velocity vector (e.g. speed over ground) to continuously optimize route planning. For example, the UAV may change direction or speed to reduce flight power consumption (e.g. battery usage) over an intended flight route in view of the current wind situation.
Turning now to other embodiments, in some embodiments, the instructions when executed by the processor may additionally or alternatively cause the user device 100 to transmit (e.g. report) the determined wind speed (or velocity) to an external server or
crowd-sourcing service. The user device may transmit wind speed (or velocity) data to the server or crowd-sourcing service each time the wind speed is measured (e.g. transmitted in real-time), or, for example periodically. The instructions when executed by the processor may additionally or alternatively cause the user device 100 to determine the wind speed periodically, on request by a user of the user device, or on request by the server or crowd-sourcing service (e.g. the user device may receive an instruction from the server or crowd-sourcing device instructing the user device to determine the wind speed).
Such an external server or crowd-sourcing service may be used to build up real-time weather maps based on local user data. It may further enable local wind statistics to be determined. Such measurements may further be aggregated, for example, wind speed measurements made by two or more nearby devices may be used to build up a wind speed map in that particular region. In some examples, image data recorded by different user devices in different directions may be merged (e.g. into a full 360 degree image sweep), thus enabling improved wind direction estimates to be made.
In some embodiments, information relating to the determined wind speed and/or direction may further be stored in the metadata of images for further use and applications. For example, in addition to geo position and time of capture which are routinely stored in image metadata, wind speed and direction may also be stored.
In some embodiments, the instructions when executed by the processor may additionally or alternatively cause the user device 100 to determine the number and/or physical properties (e.g. such as size, shape and color/reflectance) of particles in a field of view of the imaging module 108. Where the particles comprise e.g. snow, ice or water droplets, such information may be used, for example, to estimate the amount of precipitation.
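As a rough illustration of how measured droplet sizes might feed such a precipitation estimate (a naive sketch; a real estimate would also need the sampled air volume, fall speeds and the drop-size distribution):

```python
import math

def droplet_water_volume(radii_m):
    """Total water volume (m^3) of the detected droplets, assuming
    each is a sphere of the measured radius."""
    return sum(4.0 / 3.0 * math.pi * r ** 3 for r in radii_m)

# Three droplets of 1 mm radius detected in the field of view:
v = droplet_water_volume([1e-3, 1e-3, 1e-3])
print(f"{v * 1e9:.2f} mm^3")  # 12.57 mm^3
```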
Turning now to Figure 4 there is a method 400 of determining wind speed using a user device such as the user device 100 described above. In a first block 402, the method 400 comprises illuminating an airborne particle using an illumination source of the user device. In a second block 404, the method comprises recording, using an imaging module of the user device, image data of the particle whilst the particle is illuminated by the illumination source. In a third block 406, the method 400 comprises determining a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
In some embodiments, the method 400 may be performed by a user device such as the user device 100. In some embodiments, the method 400 may be performed by a
server such as an external server associated with a crowd-sourcing service. For example, an external server may instruct a user device or user devices to perform the method 400.
In some embodiments, the timing information may comprise one or more of: i) a shutter speed of the imaging module, ii) a duration of time during which the particle is recorded by the imaging module, and iii) a duration of time during which the particle is illuminated by the illumination source. Details relating to the form and use of the timing information were described in detail above with respect to user device 100 and the details therein will be understood to apply equally to the method 400.
In some embodiments, determining 406 a wind speed further comprises determining a wind speed based on a focal length of the image data. Details relating to the form and use of the focal length of the image data were described above with respect to device 100 and the details therein will be understood to apply equally to block 406.
In some embodiments, the method 400 may further comprise determining an angle of inclination of the user device, and determining a wind velocity based on the determined wind speed and the determined angle of inclination of the user device.
In some embodiments, the method 400 may further comprise determining an orientation of the user device, and determining a wind velocity based on the determined wind speed and the determined orientation of the user device.
Details relating to determining an angle of inclination or orientation of the user device and determining a wind velocity based on the determined wind speed and inclination/orientation were described above with respect to the device 100 and the details therein will be understood to apply equally to method 400.
In some embodiments, the method 400 further comprises recording a first portion of image data of a first particle when the user device is pointing in a first direction, recording a second portion of image data of a second particle when the user device is pointing in a second direction, determining a speed of movement of the first particle and a speed of movement of the second particle, and determining the wind speed based on the determined speeds of movement of the first and second particles. For example, the method 400 may comprise rotating the device between the first and second directions. Determining a wind speed based on first and second portions of image data comprising first and second particles was described above with respect to user device 100 and the details therein will be understood to apply equally to the method 400.
In some embodiments, the method 400 further comprises determining a speed of movement of the user device, and determining a wind speed based on the calculated speed of movement of the particle and the determined speed of movement of the user device. As described above with respect to the device 100, the method 400 may comprise setting a focal length of the imaging module of the user device based on the determined speed of movement of the user device. For example, setting a focal length may comprise setting a deeper focal length when the user device is travelling at a first speed and a shallower focal length when the user device is travelling at a second speed, the first speed being faster than the second speed.
In some embodiments, wherein the user device comprises a vehicle, the method 400 may further comprise amending a route plan for the vehicle based on the determined wind speed and the determined speed of movement of the vehicle.
Amending a route plan in this way was described above with respect to the user device 100 and the details therein will be understood to apply equally to method 400.
In some embodiments, the method 400 may further comprise the user device transmitting the determined wind speed to an external server or crowd-sourcing service. Transmitting the wind speed to an external server or crowd-sourcing service in this way was described above with respect to the user device 100 and the details therein will be understood to apply equally to method 400.
In some embodiments, the method 400 may further comprise repeating the blocks of illuminating an airborne particle, recording image data and determining a wind speed, using one or more other user devices, and determining a wind speed based on the wind speed determined using the user device and the wind speeds determined using the one or more other user devices. For example, each of the other devices may comprise a user device 100. For example, the user device and the other user devices may be co-ordinated by an external server such as a server associated with crowd- sourcing as described above. As described above, with respect to the user device 100, the other user devices may record image data (comprising a particle) in different orientations compared to the user device and wind speeds measured from the recorded image data of the other devices may be aggregated to determine a more accurate wind speed or to determine directional information that may be used to determine a velocity or velocity field of the wind.
As such, in some embodiments, the method 400 may further comprise acquiring location information relating to each of the one or more other devices, and determining a velocity field describing wind speed and wind direction based on the acquired
location information, the wind speed determined using the user device and the wind speeds determined using the one or more other devices.
In some embodiments, there is a computer program comprising instructions which, when executed on at least one processor, cause the at least one processor to carry out the method 400. In some embodiments, there is a computer program product comprising a computer-readable medium with the aforementioned computer program.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. The word "comprising" does not exclude the presence of elements or steps other than those listed in a claim, "a" or "an" does not exclude a plurality, and a single feature or other unit may fulfil the functions of several units recited in the claims. Any reference numerals or labels in the claims shall not be construed so as to limit their scope.
Claims
1. A user device for determining wind speed, the user device comprising:
a processor (102);
a memory (104);
an illumination source (106); and
an imaging module (108) for capturing image data, whereby the memory contains instructions executable by the processor to cause the user device to:
illuminate an airborne particle using the illumination source;
record, using the imaging module, image data of the particle whilst the particle is illuminated by the illumination source; and
determine a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
2. A user device as in claim 1 , wherein the timing information comprises one or more of:
a shutter speed or exposure time of the imaging module;
a duration of time during which the particle is recorded by the imaging module; and
a duration of time during which the particle is illuminated by the illumination source.
3. A user device as in claim 1 or 2, wherein the instructions when executed by the processor further cause the user device to determine wind speed based on a focal length of the imaging module used for recording the image data.
4. A user device as in any one of claims 1 to 3, further comprising an accelerometer and/or gyroscope and wherein the instructions when executed by the processor further cause the user device to:
determine an angle of inclination of the user device; and
determine a wind velocity based on the determined wind speed and the determined angle of inclination of the user device.
5. A user device as in any one of claims 1 to 4, further comprising a compass module (110) and wherein the user device being caused to determine a wind speed further comprises the user device being caused to:
determine, using the compass, an orientation of the user device; and
determine a wind velocity based on the determined wind speed and the determined orientation of the user device.
6. A user device as in any one of claims 1 to 5, wherein the instructions when executed by the processor cause the user device to:
record a first portion of image data of a first particle,
record a second portion of image data of a second particle;
determine a speed of movement of the first particle and a speed of movement of the second particle; and
determine the wind speed based on the determined speeds of movement of the first and second particles.
7. A user device as in any one of claims 1 to 6, wherein the instructions when executed by the processor further cause the user device to:
determine a speed of movement of the user device; and
determine a wind speed based on the calculated speed of movement of the particle and the determined speed of movement of the user device.
8. A user device as in claim 7, wherein the instructions when executed by the processor further cause the user device to:
set a focal length of the imaging module based on the determined speed of movement of the user device.
9. A user device as in claim 8, wherein the instructions when executed by the processor further cause the user device to set a deeper focal length when the user device is travelling at a first speed and a shallower focal length when the user device is travelling at a second speed, the first speed being faster than the second speed.
10. A user device as in any one of claims 7 to 9, wherein the user device comprises a vehicle and the instructions when executed by the processor further cause the user device to:
amend a route plan for the vehicle based on the determined wind speed and the determined speed of movement of the vehicle.
11. A user device as in any one of claims 1 to 10, wherein the instructions when executed by the processor further cause the user device to:
transmit the determined wind speed to an external server or crowd-sourcing service.
12. A method of determining wind speed using a user device, the method comprising:
illuminating (402) an airborne particle using an illumination source of the user device;
recording (404), using an imaging module of the user device, image data of the particle whilst the particle is illuminated by the illumination source; and
determining (406) a wind speed by calculating a speed of movement of the particle based on a change of position of the particle in the image data and timing information related to the capture of the image data.
13. A method as in claim 12, wherein the timing information comprises one or more of:
a shutter speed or exposure time of the imaging module;
a duration of time during which the particle is recorded by the imaging module; and
a duration of time during which the particle is illuminated by the illumination source.
14. A method as in claim 12 or 13, wherein determining 406 a wind speed further comprises determining a wind speed based on a focal length of the imaging module used for recording the image data.
15. A method as in any one of claims 12 to 14, further comprising:
determining an angle of inclination of the user device; and
determining a wind velocity based on the determined wind speed and the determined angle of inclination of the user device.
16. A method as in any one of claims 12 to 15, further comprising
determining an orientation of the user device; and
determining a wind velocity based on the determined wind speed and the determined orientation of the user device.
17. A method as in any one of claims 12 to 16, further comprising:
recording a first portion of image data of a first particle when the user device is pointing in a first direction,
recording a second portion of image data of a second particle when the user device is pointing in a second direction;
determining a speed of movement of the first particle and a speed of movement of the second particle; and
determining the wind speed based on the determined speeds of movement of the first and second particles.
18. A method as in any one of claims 12 to 17, further comprising:
determining a speed of movement of the user device; and
determining a wind speed based on the calculated speed of movement of the particle and the determined speed of movement of the user device.
19. A method as in claim 18, further comprising:
setting a focal length of the imaging module of the user device based on the determined speed of movement of the user device.
20. A method as in claim 19, wherein setting a focal length comprises:
setting a deeper focal length when the user device is travelling at a first speed and a shallower focal length when the user device is travelling at a second speed, the first speed being faster than the second speed.
21. A method as in any one of claims 18 to 20, wherein the user device comprises a vehicle, the method further comprising:
amending a route plan for the vehicle based on the determined wind speed and the determined speed of movement of the vehicle.
22. A method as in any one of claims 12 to 21 , further comprising the user device transmitting the determined wind speed to an external server or crowd-sourcing service.
23. A method as in any one of claims 12 to 22, further comprising:
repeating the steps of illuminating (402) an airborne particle, recording (404) image data and determining (406) a wind speed, using one or more other user devices; and
determining a wind speed based on the wind speed determined using the user device and the wind speeds determined using the one or more other user devices.
24. A method as in claim 23, further comprising:
acquiring location information relating to each of the one or more other devices; and
determining a velocity field describing wind speed and wind direction based on the acquired location information, the wind speed determined using the user device and the wind speeds determined using the one or more other devices.
25. A computer program comprising instructions which, when executed on at least one processor, cause the at least one processor to carry out the method according to any one of claims 12 to 24.
26. A computer program product comprising a computer-readable medium with the computer program as claimed in claim 25.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2018/054049 WO2019158222A1 (en) | 2018-02-19 | 2018-02-19 | Methods and user devices for determining wind speed |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019158222A1 (en) | 2019-08-22 |
Family
ID=61581239
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2018/054049 WO2019158222A1 (en) | 2018-02-19 | 2018-02-19 | Methods and user devices for determining wind speed |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2019158222A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11402855B2 (en) * | 2017-07-21 | 2022-08-02 | Nec Corporation | Processing device, drive control device, data processing method, and storage medium for attitude control of moving body based on wind conditions |
US11415592B2 (en) * | 2019-06-20 | 2022-08-16 | James Eagleman | Computing device and related methods for determining wind speed values from local atmospheric events |
US20240019876A1 (en) * | 2022-07-15 | 2024-01-18 | Wing Aviation Llc | Tether-Based Wind Estimation |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060175561A1 (en) * | 2005-02-09 | 2006-08-10 | Innovative Scientific Solutions, Inc. | Particle shadow velocimetry |
EP1736783A1 (en) * | 2004-03-31 | 2006-12-27 | The Tokyo Electric Power Company Incorporated | Fluid measuring system and long focal point optical system |
EP3018483A1 (en) * | 2014-11-07 | 2016-05-11 | photrack AG | Method and system for determining the velocity and level of a moving fluid surface |
2018-02-19: WO PCT/EP2018/054049 patent/WO2019158222A1/en — active Application Filing
Non-Patent Citations (1)
Title |
---|
RAINER HAIN ET AL: "On the possibility of using mobile phone cameras for quantitative flow visualization", 18TH INTERNATIONAL SYMPOSIUM ON THE APPLICATION OF LASER AND IMAGING TECHNIQUES TO FLUID MECHANICS, LISBON, PORTUGAL, JULY 4-7, 7 July 2016 (2016-07-07), XP055513947 * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18709278; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18709278; Country of ref document: EP; Kind code of ref document: A1 |