System And Method For Reporting Events

Heinonen; Tero; et al.

Patent Application Summary

U.S. patent application number 14/534803 was published by the patent office on 2015-05-14 for system and method for reporting events. The applicant listed for this patent is Sharper Shape Ltd. Invention is credited to Tero Heinonen, Juha Hyyppa, and Anttoni Jaakkola.

Publication Number: 20150130840
Application Number: 14/534803
Family ID: 53043441
Publication Date: 2015-05-14

United States Patent Application 20150130840
Kind Code A1
Heinonen; Tero; et al. May 14, 2015

SYSTEM AND METHOD FOR REPORTING EVENTS

Abstract

Disclosed is a system for reporting changes to a network in case of an event. The system includes a survey unit adapted to be located at a site of the network using data from a positioning sensor of the survey unit. The survey unit is configured to request from a control unit an augmented view related to the location of the site of the network and to display the augmented view on a display of the survey unit on top of a current view of the site. The survey unit is adapted to capture a photograph on the display and to communicate the photograph to the control unit. The control unit is configured to determine changes in the network by comparing the current view, as shown in the photograph, with the augmented view, and to create an event report including a catalog of the changes to the network.


Inventors: Heinonen; Tero; (Helsinki, FI) ; Hyyppa; Juha; (Espoo, FI) ; Jaakkola; Anttoni; (Espoo, FI)
Applicant: Sharper Shape Ltd., Helsinki, FI
Family ID: 53043441
Appl. No.: 14/534803
Filed: November 6, 2014

Related U.S. Patent Documents

Application Number: 61/901,492, filed Nov 8, 2013

Current U.S. Class: 345/633
Current CPC Class: G01S 17/08 20130101; G06Q 50/06 20130101; G01S 17/89 20130101; G06Q 10/0631 20130101
Class at Publication: 345/633
International Class: G06T 19/00 (2006.01)

Claims



1. A method for reporting changes to a network in case of an event, the method comprising the steps of: requesting from a control unit, after the occurrence of an event, an augmented view of a site as before the occurrence of the event; overlapping the augmented view with a current view, by using a first survey unit; capturing a photograph of the current view along with the overlapped augmented view; sending the photograph to the control unit; and determining changes to the network by comparing the current view as shown in the photograph with the augmented view.

2. A method according to claim 1, wherein the augmented view is constructed using mission prior data, the mission prior data being collected by using a second survey unit, satellite unit data, or Light Detection And Ranging (LiDAR) equipment data in drones or helicopters.

3. A method according to claim 1, wherein the current view is a camera view rendered on a display of the first survey unit.

4. A method according to claim 3, wherein the photograph contains the camera view along with the overlapped augmented view.

5. A method according to claim 2, wherein the method further comprises updating the mission prior data based on the determined changes to the network.

6. A system for reporting changes to a network in case of an event, the system comprising a first survey unit adapted to be located at a site of the network, using data from a positioning sensor of the survey unit, the survey unit being configured to request from a control unit an augmented view related to the location of the site of the network, the first survey unit being further adapted to display the augmented view in a display of the survey unit on top of a current view of the site, the first survey unit being also adapted to capture a photograph on the display and to communicate the photograph and the positioning sensor data to the control unit, the control unit being configured to receive and to store the photograph and the positioning sensor data from the first survey unit, the control unit being further configured to determine changes in the network by comparing the current view as shown in the photograph with the augmented view, and to create an event report including a catalog of the changes to the network to be accessible therefrom.

7. A system according to claim 6, wherein the augmented view is constructed using mission prior data, the mission prior data being collected by using a second survey unit, satellite unit data, or Light Detection And Ranging (LiDAR) equipment data in drones or helicopters.

8. A system according to claim 6, wherein the current view is a camera view rendered on a display of the first survey unit.

9. A system according to claim 7, wherein the control unit is further configured to update the mission prior data based on the determined changes to the network.

10. A system according to claim 8, wherein the augmented view is rendered on the display of the first survey unit by executing with a processor computing instructions stored in a memory of the first survey unit, the computing instructions being configured to use the data from the position sensor to determine a direction of the camera view in relation to the site, capture an image of the camera view, and augment a view on the display using the data from the position sensor.

11. An apparatus for documenting changes to a network, the apparatus comprising: a communication interface; a camera; at least one location sensor for determining a location of the apparatus and a rotation of the apparatus relative to ground level and to map coordinates; a memory for storing computing instructions; and a processor configured to execute the computing instructions to request an augmented view of a site, based on the location of the apparatus, use the data from the at least one position sensor to determine a direction of a camera view in relation to the site, overlap the augmented view with the camera view using the data from the at least one position sensor, capture a photograph of the camera view along with the overlapped augmented view, and communicate the photograph to a device external to the apparatus.
Description



CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority to, and the benefit of, U.S. provisional Patent Application No. 61/901,492, filed on 8 Nov. 2013; and is related to, and claims the benefit of, U.S. Patent Application Ser. No. 61/901,489 filed on 8 Nov. 2013 entitled System for Monitoring Power Lines (Docket SLSH.2649.USU2/Sharpershape001); and U.S. Patent Application Ser. No. 61/901,490 filed on 8 Nov. 2013 entitled System and Method for Allocating Resources (Docket SLSH.2650.USU2/Sharpershape002); the disclosures of which are incorporated herein by reference in their entirety.

TECHNICAL FIELD

[0002] The present disclosure generally relates to a system and a method for reporting changes to a network in case of an event, and more particularly to reporting and documenting changes concerning damage to infrastructure networks due to such an event.

BACKGROUND

[0003] Infrastructure networks (such as power lines, water pipes, oil and gas pipes, etc.) are prone to damage over time. Consider, for example, power line (PL) networks, which are usually extensive and comprise several components like conductors, insulators, pylons and other associated structures such as spacers, dead-ends, switch boxes, etc. Such PL networks are often exposed to potential threats, mainly caused by encroaching vegetation; for example, a tree may eventually grow so tall that, were it to fall during a storm, it would break the power line. Furthermore, in case of calamities like a storm, a flood, an earthquake, a hurricane, or the like, substantial damage may occur to the PL network, causing massive disruption to power distribution and to the whole society dependent on electricity.

[0004] In all these circumstances, a quick and accurate analysis of the damage is of utmost importance to electricity transmission and distribution operators, both for an accurate assessment of the situation and for subsequently managing the repair work efficiently. Without proper and timely reporting, it is difficult to allocate repair personnel to the appropriate places.

[0005] Substantial costs are involved in monitoring, identifying, reporting, documenting and assessing damage to such networks. Traditionally this has been achieved primarily by on-site manual inspection; however, sending official representatives to report damage to these infrastructure networks usually takes a long time. Moreover, in the case of severe events such as a major thunderstorm, the same event has often damaged access roads, or felled trees onto them, preventing outside personnel from accessing the site of damage without first clearing the roads, which can take days or, in the worst case, weeks.

[0006] Furthermore, independently reporting on these extensive networks and keeping the information up to date in a database, whether by the staff of the company or its subcontractors, is a time- and resource-consuming task. At the same time, the people residing or staying in or near the site of damage are not capable of assessing the situation (as they do not know what to look for) or of communicating the findings in a useful and understandable way to the damage assessment firm.

[0007] Therefore, there exists a need for a system that solves the problems associated with reporting damage to infrastructure networks and that overcomes the above-described limitations of existing systems.

BRIEF SUMMARY

[0008] The present disclosure provides a system and a method for reporting changes to a network in case of an event. More specifically, the present disclosure relates to a system and a method for identifying changes concerning damage to an infrastructure network, and for reporting and documenting these damages so that actions related to repair activities for such networks can be assigned.

[0009] In one aspect, embodiments of the present disclosure provide a method for reporting changes to a network in case of an event. The method comprises the steps of: requesting from a control unit, after the occurrence of an event, an augmented view of a site as before the occurrence of the event; overlapping the augmented view with a current view, by using a first survey unit; capturing a photograph of the current view along with the overlapped augmented view; sending the photograph to the control unit; and determining changes to the network by comparing the current view as shown in the photograph with the augmented view.

[0010] According to an embodiment, the method further comprises updating the mission prior data based on the determined changes to the network.

[0011] In another aspect, embodiments of the present disclosure provide a system for reporting changes to a network in case of an event. The system comprises a first survey unit adapted to be located at a site of the network using data from a positioning sensor of the survey unit. The survey unit is further configured to request from a control unit an augmented view related to the location of the site of the network. The first survey unit is further adapted to display the augmented view in a display of the survey unit on top of a current view of the site. The first survey unit is also adapted to capture a photograph on the display and to communicate the photograph and the positioning sensor data to the control unit. The control unit is configured to receive and to store the photograph and the positioning sensor data from the first survey unit. The control unit is further configured to determine changes in the network by comparing the current view as shown in the photograph with the augmented view, and to create an event report including a catalog of the changes to the network to be accessible therefrom.

[0012] According to an embodiment, the control unit is further configured to update the mission prior data based on the determined changes to the network.

[0013] In an example, the augmented view is rendered on the display of the first survey unit by executing, with a processor, computing instructions stored in a memory of the first survey unit. The computing instructions are configured to use the data from the position sensor to determine a direction of the camera view in relation to the site, capture an image of the camera view, and augment a view on the display using the data from the position sensor.

[0014] In an embodiment, the augmented view is constructed using mission prior data. The mission prior data is collected by using a second survey unit, satellite unit data, or data from Light Detection and Ranging (LiDAR) equipment in drones or helicopters.

[0015] In an example, the current view is a camera view rendered on a display of the first survey unit.

[0016] Further, the photograph contains the camera view along with the overlapped augmented view.

[0017] In yet another aspect, embodiments of the present disclosure provide an apparatus for documenting changes to a network. The apparatus comprises a communication interface, a camera, at least one location sensor for determining a location of the apparatus and a rotation of the apparatus relative to ground level and to map coordinates, a memory for storing computing instructions, and a processor configured to execute the computing instructions. The computing instructions are configured to request an augmented view of a site based on the location of the apparatus, use the data from the at least one position sensor to determine a direction of a camera view in relation to the site, overlap the augmented view with the camera view using the data from the at least one position sensor, capture a photograph of the camera view along with the overlapped augmented view, and communicate the photograph to a device external to the apparatus.

[0018] Additional aspects, advantages, features and objects of the present disclosure would be made apparent from the drawings and the detailed description of the illustrative embodiments.

[0019] It will be appreciated that features of the disclosure are susceptible to being combined in various combinations or further improvements without departing from the scope of the disclosure.

DESCRIPTION OF THE DRAWINGS

[0020] The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the disclosure is not limited to specific methods and instrumentalities disclosed herein. Wherever possible, like elements have been indicated by identical numbers.

[0021] FIG. 1 illustrates a pictorial representation of a system for reporting changes to a network in case of an event associated with an exemplary infrastructure network, in accordance with an embodiment of the present disclosure;

[0022] FIG. 2 illustrates a schematic diagram of an apparatus for documenting changes to a network, in accordance with embodiments of the present disclosure;

[0023] FIG. 3 illustrates a flow diagram for the event reporting system, in accordance with embodiments of the present disclosure; and

[0024] FIG. 4 is an illustration of steps of a method for reporting changes to a network in case of an event, in accordance with an embodiment of the present disclosure.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0025] The present disclosure provides a system 100 for reporting changes to a network in case of an event, hereinafter simply referred to as the system 100. The system 100 of the present disclosure is configured for reporting and documenting changes to an infrastructure network. In particular, the system 100 is configured to collect information related to damage to any component or object in a network, optionally provide an assessment of the damage based on past information about the same component, and generate and document an event report with details of the damage for further use, such as repair activities.

[0026] More specifically, the system 100 of the present disclosure enables a local user or personnel already present at the site of damage to collect information. The system 100 may further provide means to enable such personnel to assess the damage to any component in the network based on the available information on the past condition of the same component, either already known to the personnel or provided by some other means in the system 100. In an embodiment, the system 100 may additionally be integrated with other systems involved in executing actions related to repair activities for such networks, based on the damage report generated by the present system 100. Herein, "personnel" refers to any user(s) or person(s), independent of their contractual or employment status.

[0027] Referring now to the drawings, particularly by their reference numbers, FIG. 1 illustrates an embodiment of the present system 100 associated with an exemplary infrastructure network 200. For the purpose of the present disclosure, the system 100, as shown in FIG. 1, has been depicted as a damage reporting system for a power line (PL) network 200. Hereinafter, the terms "PL network", "infrastructure network" and "network" are used interchangeably. Such a network 200 typically comprises several components 202 like, for example, poles 203, 204, 205; conductor wires 206, 207; insulators; pylons and other associated structures such as spacers, dead-ends, switch boxes, etc. These networks 200 are usually extensive and run through various territories, including urban areas, rural areas, countryside, forests, etc. As depicted in the exemplary embodiment of FIG. 1, the PL network 200 may be installed in a site 210, like a forest comprising various objects 212 such as trees 214, 215, 216.

[0028] According to an embodiment of the present disclosure, the system 100 includes a survey unit, such as a first survey unit 110 (the term "first survey unit" is used interchangeably with "survey unit" hereinafter). The system also includes other survey units, such as a second survey unit (not shown in FIG. 1). The survey unit 110 is configured to collect information related to damage to the network 200. The survey unit 110 is configured for a utility monitoring task, in the present case monitoring of the components 202 and objects 212, and collecting remote sensing data 112 for the same. Examples of remote sensing data 112 include: a 3D point cloud (from Light Detection and Ranging, LiDAR), a 3D point cloud (from Synthetic Aperture Radar, SAR), a 2D image (from a thermal, infrared, or photographic camera, or SAR), or any other digital representation of the results of remote sensing of the components 202 and objects 212. Typically, "LiDAR" is used to denote a LiDAR system, although the word "system" is usually omitted. In the present disclosure the term camera can refer to, but is not limited to: an RGB (red green blue) camera, RGBN (RGB + near infrared) camera, infrared camera, near-infrared camera, thermal camera, video camera, high frequency video camera, multispectral camera, hyperspectral camera, multispectral video camera, or hyperspectral video camera.

[0029] In an embodiment, the remote sensing data 112 includes mission prior data 114. The mission prior data 114 can be any available data related to the components 202 and the objects 212 before the occurrence of an event. Herein, the event can be natural, such as a flood, an earthquake, a storm, a hurricane, or the like; or man-made, such as construction activities, deforestation, etc. The mission prior data 114 is collected by using the second survey unit or other survey units present in the system 100, or it can be data which has been collected, for example, from satellites, using LiDAR in drones or helicopters, with mobile terminals, etc. Alternatively, the mission prior data 114 may be collected by using the first survey unit 110.

[0030] In an embodiment, the mission prior data 114 is used to construct an augmented view related to a location of the site 210 of the network 200. The augmented view represents the site 210, associated with the location of the site 210, as before the occurrence of the event.

[0031] For example, as shown, the mission prior data 114 may contain positions of the poles 203, 204, 205 and the conductor wires 206, 207 disposed between the poles 203, 204, 205. The view may also have information on the trees 214, 215, 216 or other objects 212 present in the site 210. The augmented view accordingly includes the poles 203, 204, 205, the conductor wires 206, 207 disposed between the poles 203, 204, 205, and the trees 214, 215, 216 present in the site 210 before the event.

[0032] According to an embodiment, the remote sensing data 112 can also include mission current data 116, that is, any available data related to the components 202 and the objects 212 after the occurrence of the event. In an embodiment, the mission current data 116 is used to construct a current view related to the location of the site 210 of the network 200 after the occurrence of the event, which is explained in greater detail later herein. The mission current data 116 may have information on the tree 215x which has now fallen down, as shown in FIG. 1. The view may also have information on the pole 205x which has now fallen down and/or on the conductor wire 207x which is now cut and fallen down.

[0033] It may be contemplated by a person ordinarily skilled in the art that the remote sensing data 112 may be absolute (as in specific coordinates), relative (to corresponding mission data 114, 116), or structural (e.g. the topology of, or proximity between, the components 202 and the objects 212). Further, the remote sensing data 112 may be discrete or probabilistic (in the sense of a probability distribution of the components 202 and the objects 212). Further, it may be understood that the mission current data 116 could be similar to or different from the mission prior data 114.

[0034] The collection of the remote sensing data 112 involves regular monitoring of the components 202 and the objects 212. According to one embodiment of the present disclosure, each survey unit 110 may include at least one piece of remote sensing equipment 118. The remote sensing equipment 118 may include digital remote sensing equipment and instruments such as LiDAR, SAR, a thermal camera, a camera or video camera, x-ray radar, etc. The remote sensing equipment 118 may be located near the target site 210, or may be located remote from the site 210, gathering information by remote communication means. The remote sensing equipment 118 may be installed on and operated from a mobile platform, for example a copter, a fixed-wing plane, an Unmanned Aerial Vehicle (UAV), an Unmanned Aerial System (UAS), a satellite, or a wheeled terrain vehicle such as a car, a forest machine, etc.

[0035] In an embodiment, the remote sensing equipment 118 includes LiDAR systems as a primary information source. LiDAR (also written LIDAR) is a remote sensing technology that measures distance by illuminating a target with a laser and analyzing the reflected light. The term "LiDAR" comes from combining the words light and radar. This emerging data acquisition tool provides an opportunity to classify a utility corridor scene more reliably, and thus to generate accurate 3D models of infrastructure features, owing to LiDAR's capability for highly dense, accurate, multiple-echo data acquisition, which can also provide information on the internal structure of vegetation.

[0036] LiDAR uses ultraviolet, visible, or near infrared light to image objects and can be used with a wide range of targets, including non-metallic objects, rocks, rain, chemical compounds, aerosols, clouds and even single molecules. LiDAR systems employ a narrow laser beam which can be used to map physical features with very high resolution. Wavelengths from about 10 micrometers to the UV (ca. 250 nm) are used to suit the target. Typically light is reflected via backscattering. Different types of scattering are used for different LiDAR applications; most common are Rayleigh scattering, Mie scattering, Raman scattering, and fluorescence. Based on different kinds of backscattering, the LiDAR can be accordingly called Rayleigh LiDAR, Mie LiDAR, Raman LiDAR, Na/Fe/K Fluorescence LiDAR, and so on. Suitable combinations of wavelengths can allow for remote mapping of atmospheric contents by looking for wavelength dependent changes in the intensity of the returned signal.

[0037] According to an alternative and preferred embodiment of the present disclosure, the survey unit 110 may be constituted by personnel 120 already located at the site 210. The personnel 120, as a part of the survey unit 110, is equipped with a mobile terminal 122 such as, but not limited to, a smart phone, a laptop, a tablet, a smart camera, or some combination thereof.

[0038] Referring now to FIG. 2, illustrated is an exemplary embodiment of the terminal 122. The terminal 122 includes a display 122a, a camera 122b, a user interface 122c, a communication interface 122d, a central processing unit (CPU) 122e, a sensor unit 122f (including a compass, an accelerometer, a magnetometer, and a global navigation satellite system (GNSS) sensor such as a global positioning system (GPS) sensor), and other components such as memories, etc. In addition, the terminal 122 also includes a power source 122g, such as batteries, to provide electricity for the above listed parts of the mobile terminal 122.

[0039] It may be understood by a person skilled in the art that the various parts of the mobile terminal 122 function together to collect and analyze remote sensing data 112 from the site 210. For example, the camera 122b and the sensor unit 122f are connected for analyzing the view and surroundings, which is explained in greater detail later herein. The sensor unit 122f can include an accelerometer to determine the tilt angle of the terminal 122, a magnetometer to determine the direction of the terminal 122 with respect to the earth's magnetic field, a location sensor (GPS) to determine the longitude and latitude of the terminal 122, etc. The sensor unit 122f can be, for example, similar to a Microsoft® Kinect sensor or a range camera; that is, the sensor unit 122f may be a horizontal bar connected to a small base with an optionally motorized pivot, designed to be positioned lengthwise above or below the terminal 122. The sensor unit 122f may further feature an RGB camera, a depth sensor and a multi-array microphone running proprietary software, which provides full-body 3D motion capture, facial recognition and voice recognition capabilities.
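
As a purely illustrative aid (not part of the original disclosure), the following minimal Python sketch shows how a tilt angle and compass direction of the kind described above might be derived from hypothetical raw accelerometer and magnetometer readings; all function and field names are assumptions, and the heading formula omits tilt compensation for brevity.

    import math

    def device_pose(accel, mag, lat, lon):
        # accel: (ax, ay, az) accelerometer vector in m/s^2, gravity included
        # mag:   (mx, my, mz) magnetometer vector in microtesla
        # lat, lon: GPS fix of the terminal in decimal degrees
        ax, ay, az = accel
        # Tilt: angle between the gravity vector and the device z axis.
        g = math.sqrt(ax * ax + ay * ay + az * az)
        tilt_deg = math.degrees(math.acos(max(-1.0, min(1.0, az / g))))
        # Heading from the horizontal magnetometer components
        # (0 = magnetic north, increasing clockwise); no tilt compensation.
        mx, my, _ = mag
        heading_deg = (math.degrees(math.atan2(my, mx)) + 360.0) % 360.0
        return {"lat": lat, "lon": lon,
                "heading_deg": round(heading_deg, 1),
                "tilt_deg": round(tilt_deg, 1)}

    # Terminal tilted about 10 degrees, pointing about 56 degrees from north,
    # matching the worked example used later in the description.
    print(device_pose((0.0, 1.70, 9.66), (13.0, 19.3, -30.0), 60.1699, 24.9384))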

[0040] Further, the central processing unit 122e in the terminal 122 may include related memories (non-transitory, flash memory, memory cards) for running the software needed for operation. The communication interface 122d may include one or a combination of a cellular interface (2G, 3G, 4G, 4G LTE (Long Term Evolution), etc.) or a Wireless Local Area Network (WLAN) interface, generally for accessing the Internet. The user interface 122c may include a display and touch screen/buttons for the personnel 120 to operate the terminal 122.

[0041] Essentially, the mobile terminal 122 could be any device or combination of devices capable of collecting the mission prior data 114 indicative of the components 202 and objects 212 before the occurrence of the event at the site 210. The mission prior data 114 is most commonly in the form of pictures or views of locations of the site 210, showing the components 202 and objects 212 before the event.

[0042] In an example, the mission prior data 114 includes pictures captured using the camera 122b, showing the state/position of the various components 202 in the network 200, possibly in some relation to the objects 212 in the site 210 before the event. Such pictures may be stored in an external device and requested as augmented views from the external device, which is explained in greater detail later herein. In an example, creating or augmenting a view using the terminal 122 involves using the data from the position sensor, such as the sensor 122f, to determine a direction of the camera view in relation to the site 210, capturing an image of the camera view, and augmenting a view on the display 122a using the data from the position sensor, i.e. adding tags (with positioning sensor data) for further storage and processing. Additionally, the augmented view can be constructed using the mission prior data 114 collected by using a second survey unit, satellite unit data, or data from Light Detection and Ranging (LiDAR) equipment in drones or helicopters.
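
For illustration only, the following sketch shows how a captured camera view might be tagged with positioning sensor data and how a matching augmented view might be requested from the external device. The field names and the JSON transport are assumptions; the disclosure does not specify a format.

    import json
    import time

    def tag_photograph(image_bytes, pose):
        # Attach positioning-sensor tags to a captured camera view so that
        # it can later be matched against stored mission prior data.
        return {"captured_at": time.time(),
                "pose": pose,                     # lat/lon, heading, tilt
                "image_bytes": len(image_bytes)}  # payload carried separately

    def augmented_view_request(pose):
        # Body of a request asking the control unit for the augmented view
        # (mission prior data rendered for this location and direction).
        return json.dumps({"type": "augmented_view", "pose": pose})

    pose = {"lat": 60.1699, "lon": 24.9384,
            "heading_deg": 56.0, "tilt_deg": 10.0}
    print(augmented_view_request(pose))
    print(tag_photograph(b"\x89PNG...", pose))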

[0043] Similarly, the mission current data 116 includes pictures (current views) captured by using the camera 122b, showing the state/position of the various components 202 in the network 200, possibly in some relation to the objects 212 in the site 210 after the event. For example, in the process of generating such pictures, the display 122a of the terminal 122 is oriented relative to at least one of the various components 202 or the objects 212 present in the network 200, based on the positioning sensor data.

[0044] Referring again to FIG. 1, the system 100 includes a control unit 130 having a server 132 and a database 134. The control unit 130 is configured to communicate with the terminal 122 over the communication interface 122d, for example via the Internet. In an embodiment, the server 132 may be configured to receive the mission current data 116 (current view) related to the components 202 and the objects 212 in the network 200 from the survey unit 110. Specifically, the server 132 is adapted to receive the mission current data 116 from the mobile terminal 122. The server 132 may also be adapted to receive the mission prior data 114. The received data 114, 116 may be stored in the provided database 134, from where it can be accessed by other modules of the system 100.

[0045] In an embodiment, the server 132 may further be configured to send the mission prior data 114 to the survey unit 110, or specifically to the terminal 122, for the perusal of the personnel 120. This mission prior data 114 may be sent in the form of the augmented view representing the original state/position of the various objects 212 in the site 210, before any change/damage.

[0046] In an example, the survey unit 110, or specifically the terminal 122, is adapted to be located at the site 210 of the network 200, using data from a positioning sensor of the survey unit 110. The survey unit 110 is configured to request from the control unit 130 the augmented view related to the location of the site 210 of the network 200.

[0047] The augmented view is rendered on the display 122a of the terminal 122 by executing, with a processor (i.e. the central processing unit 122e), computing instructions stored in the memory of the terminal 122. In an example, the computing instructions are configured to use the positioning sensor data for rendering the augmented view on the display 122a. The survey unit 110, or specifically the terminal 122, is further adapted to display the augmented view on the display 122a of the terminal 122 on top of a current view of the site 210. The current view includes information on the tree 215x which has now fallen down, as shown in FIG. 1. The view may also have information on the pole 205x which has now fallen down and/or on the conductor wire 207x which is now cut and fallen down. The current view is a camera view rendered on the display 122a of the terminal 122.

[0048] The survey unit 110 is also adapted to capture a photograph on the display 122a and to communicate the photograph and the positioning sensor data to the control unit 130. The photograph contains the camera view along with the overlapped augmented view. The control unit 130 is configured to receive and store the photograph and the positioning sensor data from the survey unit 110. The control unit 130 is further configured to determine changes in the network 200 by comparing the current view, as shown in the photograph, with the augmented view. The control unit 130 is further configured to create an event report including a catalog of the changes to the network 200, to be accessible therefrom. The control unit 130 is further configured to update the mission prior data based on the determined changes to the network.

[0049] In an example, the display 122a may show, via the viewfinder of the terminal 122, roughly the same perspective view V1 (the augmented view requested from the control unit 130) as the view V2 (current view) presently seen by the personnel 120 from his/her current position. The software in the terminal 122 achieves this by using the sensor information from the terminal 122 and communicating the same to the control unit 130, which generates the view V1 in consideration of this sensor information. In an embodiment, the view V1 may be shown as a dashed-line figure or by some transparent means for the perusal of the personnel 120.

[0050] The personnel 120 then tries to direct the terminal's camera 122b so as to overlap the dashed-line figure of the view V1 with the current view V2, that is, tries to align the transparent recorded image with reality. The terminal 122 may be configured to determine differences between the two views, indicative of changes/damage to the PL network 200. This can be done automatically by the CPU 122e performing a comparative analysis of the views V1 and V2. Alternatively, the terminal 122 may include an option allowing the personnel 120 to indicate the changes by means of a provided user interface. The personnel 120 could, for example, use a touch screen of the terminal 122 to circle one or more objects which have changed from the image view V1 from the database 134. The said information is sent back to the database 134 as an event report indicative of the damage/changes in the power line network 200.
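
The disclosure leaves the comparative analysis unspecified. As one hedged possibility, the toy sketch below models the two views as equally sized grids of grayscale intensities and flags cells where they diverge, standing in for whatever image-differencing the CPU 122e (or the server) might actually use; names and the threshold are invented for illustration.

    def changed_cells(v1, v2, threshold=40):
        # Flag grid cells where the current view V2 deviates from the
        # reference view V1 by more than a threshold; views are modeled
        # as equally sized 2-D lists of grayscale intensities (0-255).
        changes = []
        for r, (row1, row2) in enumerate(zip(v1, v2)):
            for c, (p1, p2) in enumerate(zip(row1, row2)):
                if abs(p1 - p2) > threshold:
                    changes.append((r, c))
        return changes

    # Toy 3x4 views: one cell differs sharply, e.g. where a pole has fallen.
    v1 = [[200, 200, 200, 200], [90, 90, 90, 90], [50, 50, 50, 50]]
    v2 = [[200, 200, 200, 200], [90, 90, 210, 90], [50, 50, 50, 50]]
    print(changed_cells(v1, v2))  # -> [(1, 2)]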

[0051] Still alternatively, the overlapped views V1 and V2 may be shared with the control unit 130. For example, the terminal 122 is adapted to capture a photograph (of the overlapped views V1 and V2) on the display 122a and to communicate the photograph to the control unit 130. The control unit 130 may include the requisite software/service to perform the comparative analysis of the views (photograph) and identify the differences. Further, the event report is generated, which can be accessed from the database 134 by operators or agencies responsible for repair activities to mitigate the damage to the network 200.

[0052] In an embodiment, the data transfer between the terminal 122 and the server 132 may follow either of two scenarios. In the first scenario, the raw data is transferred from the terminal 122, including location, video stream, depth map, point clouds, figures, and the direction in which the visual data was captured, and the analysis of the data is performed in the server 132. In the other scenario, some of the analysis is done locally and only the results are sent to the server 132. This way the amount of data to be transferred, for example via a cellular network, can be made smaller than in the first scenario.
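
The two transfer scenarios might be distinguished as in the sketch below; the payload shapes and names are purely illustrative, not part of the disclosure.

    def upload_payload(raw_capture, local_results=None):
        # Scenario 1: ship the raw capture (location, stream, point cloud,
        # view direction) and let the server 132 do the analysis.
        # Scenario 2: ship only the locally computed results, which is
        # far smaller over a cellular link.
        if local_results is not None:
            return {"kind": "results", "data": local_results}
        return {"kind": "raw", "data": raw_capture}

    raw = {"location": (60.1699, 24.9384), "direction_deg": 56.0,
           "point_cloud": [(1.0, 2.0, 0.5)] * 1000}  # bulky raw data
    print(upload_payload(raw)["kind"])                             # 'raw'
    print(upload_payload(raw, {"changed": ["pole_205"]})["kind"])  # 'results'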

[0053] In accordance with an additional embodiment of the present disclosure, the control unit 130 may be configured to send the views to multiple users/personnel 120 equipped with terminals 122. These users may be present in the same locality, such as the locality at the site of damage, or spread over some geographical area. Each user may be enabled to analyze, comment on, or rate the images (formed by the overlapping of the two views) in order to identify the differences. This crowdsourcing of the analysis provides a larger resource pool and therefore results in better damage assessment for the event report.

[0054] Moving on, FIG. 3 provides an architecture related to the system 100 of the present disclosure. In step S2.1, the personnel 120 sends a position of the terminal 122 and a direction of the viewfinder, i.e. of the camera 122b of the terminal 122. The direction may be deduced from the accompanying sensor information, such as compass direction (say, 56 degrees to north), tilt angle of the camera with respect to the ground (say, 10 degrees in relation to horizontal), latitude, longitude, etc. The terminal 122 can also send a current view, such as the view V2 (i.e. a camera view appearing on the display 122a from the camera 122b of the terminal 122), to the server 132.

[0055] In step S2.2, the server 132 scans the database 134 to generate/find/determine viewable reference data V1 (such as an image, model, shape, drawing, or outline) related to the present view V2 in the viewfinder. The view V1 is the augmented view stored in the database 134 of the control unit. Based on an embodiment, the view V1 is calculated from a set of 3D point data which has been previously measured from the said position of the terminal 122.
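
As an illustrative, non-authoritative sketch of how V1 could be calculated from previously measured 3D point data, the following projects reference points through a simple pinhole camera placed at the terminal's reported position and heading. The focal length, image center, and coordinate conventions are assumptions, and tilt is ignored for brevity.

    import math

    def project_points(points, heading_deg, f=800.0, cx=320.0, cy=240.0):
        # points: reference 3-D points as (east, north, up) offsets in
        # meters from the terminal's reported position.
        # Returns 2-D pixel positions for drawing the overlay V1.
        h = math.radians(heading_deg)
        pixels = []
        for e, n, u in points:
            x = e * math.cos(h) - n * math.sin(h)  # right of the view axis
            y = e * math.sin(h) + n * math.cos(h)  # depth along the heading
            if y <= 0.0:
                continue  # point is behind the camera
            pixels.append((cx + f * x / y, cy - f * u / y))
        return pixels

    # Top of an 8 m pole roughly 30 m away along a 56-degree heading.
    print(project_points([(25.0, 17.0, 8.0)], heading_deg=56.0))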

[0056] According to an embodiment, the view V1 may be generated by the terminal 122 (for being stored in the database 134 of the control unit 130). Alternatively, the view V1 may be provided by a second survey unit, satellite unit data, or data from Light Detection and Ranging (LiDAR) equipment in drones or helicopters.

[0057] In step S2.3, the view V1 is communicated back to the terminal 122. The view V1 can consist of one or more views. The view V1 can be an image (digital photograph), a depth map (an image showing the distance to different parts of the view), a point cloud, a thermal image, or a generated image, illustration, model, shape, drawing, or outline, depending on the capabilities of the terminal 122.

[0058] In step S2.4, the personnel 120 aims to align the received view V1 with the view V2. The personnel 120 can indicate, with a touch screen or other user interface means, objects which have changed in comparison to the received view. Specifically, the changes in the components 202 and objects 212 can be deduced from the comparison of the views V1 and V2. The personnel 120 can, for example, indicate which of the power line poles are missing or have fallen down. For example, as shown in FIG. 1, the broken pole 205x and the broken conductor wire 207x can be indicated by the personnel 120. In an embodiment, a menu is provided to the personnel 120 upon touching an object or a component appearing on the display 122a of the terminal 122. The menu can have, for example, symbols such as fallen down, broken, disappeared, tilted, no changes, etc., or related texts. In an embodiment, the menu may also have free-text input fields for the personnel 120 to provide notes related to the damage. Further, the menu information may be used to create an event report including the catalog of the changes to the network.
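
One hedged way to represent the menu selections and the resulting catalog of changes is sketched below. The change vocabulary mirrors the menu symbols named above, while the schema and object identifiers are invented for illustration.

    import json
    import time

    CHANGE_TYPES = {"fallen down", "broken", "disappeared",
                    "tilted", "no changes"}

    def event_report(reporter_pose, indicated_changes):
        # indicated_changes: (object_id, change, free-text note) triples
        # gathered from the menu; the result is the catalog of changes.
        for _, change, _ in indicated_changes:
            assert change in CHANGE_TYPES, change
        return json.dumps({
            "created_at": time.time(),
            "pose": reporter_pose,
            "catalog": [{"object": o, "change": c, "note": n}
                        for o, c, n in indicated_changes]}, indent=2)

    print(event_report(
        {"lat": 60.1699, "lon": 24.9384},
        [("pole_205", "fallen down", "snapped at the base"),
         ("wire_207", "broken", "conductor cut, on the ground")]))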

[0059] In step S2.5, the generated information (related to the status of the components 202 and the objects 212) is sent to the service in the server 132. Further, in step S2.6, this information is stored in the database 134 as updated object information. In an embodiment, the update can be accepted automatically or it can be subject to verification by a service provider. The next time the personnel 120 accesses the database 134, the updates/changes are reflected in the received view V1 at the terminal 122.

[0060] Based on embodiments, in step S2.7 the server 132 analyses the differences between the views and creates an event report including a catalog of changes. More specifically, a photograph of the view V1 overlapped on the view V2 is captured by the terminal 122, and thereafter the terminal 122 sends the photograph to the server 132. The server 132 analyses the photograph to create the event report, including the catalog of the changes to the network, which can be determined from the comparison of the view V1 overlapped on the view V2.

[0061] In step S2.8, in case the event report indicates damage which needs to be corrected, such as the broken pole 205x and the broken conductor wire 207x, the information is communicated to some third-party system, such as repair agencies. The third party may receive the information as a message, a push message, or as information accessible via an Internet connection. The third party may subsequently analyze the updated information and plan possible corrective/repair actions at the site 210.

[0062] Referring now to FIG. 4, illustrated is a method 400 for reporting changes to a network in case of an event, in accordance with an embodiment of the present disclosure.

[0063] At step 402, an augmented view is requested from a control unit, after the occurrence of an event. The augmented view is associated with a site as before the occurrence of the event. According to an embodiment, the augmented view is constructed using mission prior data. The mission prior data is collected by using at least one of a second survey unit, satellite unit data, or data from Light Detection and Ranging (LiDAR) equipment in drones or helicopters.

[0064] At step 404, the augmented view is overlapped with a current view, by using a first survey unit. In an example, the current view is a camera view rendered on a display of the first survey unit. The first survey unit is adapted to be located at the site of the network, using data from a positioning sensor of the survey unit. The survey unit is configured to request from the control unit the augmented view related to the location of the site of the network.

[0065] At step 406, a photograph of the current view along with the overlapped augmented view is captured. In an example, the photograph contains the camera view along with the overlapped augmented view.

[0066] At step 408, the photograph is sent to the control unit.

[0067] At step 410, changes to the network are determined by comparing the current view as shown in the photograph with the augmented view.

[0068] The steps 402 to 410 are only illustrative and other alternatives can also be provided where one or more steps are added, one or more steps are removed, or one or more steps are provided in a different sequence without departing from the scope of the claims herein.

[0069] For example, the method 400 further includes updating the mission prior data based on the determined changes to the network. Further, the method includes creating an event report including a catalog of the changes to the network.

[0070] In yet another aspect, embodiments of the present disclosure provide an apparatus for documenting changes to a network. The apparatus includes a communication interface, a camera, at least one location sensor for determining a location of the apparatus and a rotation of the apparatus relative to ground level and to map coordinates, a memory for storing computing instructions, and a processor configured to execute the computing instructions. The computing instructions are configured to request an augmented view of a site based on the location of the apparatus, use the data from the at least one position sensor to determine a direction of a camera view in relation to the site, overlap the augmented view with the camera view using the data from the at least one position sensor, capture a photograph of the camera view along with the overlapped augmented view, and communicate the photograph to a device external to the apparatus. The external device can be a control unit as explained above.

[0071] In a further embodiment, the present disclosure utilizes an apparatus such as the terminal 122. In an example, the apparatus can be a smart phone with a camera and a QR (quick response) code reading application. Poles and other objects in power lines could have a QR code (or another identifier, such as a Radio Frequency Identification (RFID) tag, which can be read with a smart phone). When a person (who can in practice be any person) sees a damaged object, such as a fallen power line, the person scans the identifier with the phone. In the case of a QR code, the camera of the phone is pointed at the QR code. The application in the phone connects to a service and forms an event report. The application can form a data connection, or send a short message service (SMS) message, an email, a multimedia messaging service (MMS) message, etc. The application can be a dedicated reporting application or it can be, for example, a browser in the phone. In the latter case, the QR code would be used to connect to a reporting web site and to post, at the same time, a unique identification of the object to the system. The application in the phone can further be configured to provide, or allow the user to provide, the location coordinates where the report is made. In an example embodiment, the application would send the identification read from the QR code on the pole and the GPS location of the phone at the time of scanning the code. Additionally, the application or the web site can include a form for the user to make an event report, or menus to select a type of incident and to add an image/photo taken at the place. The system could be configured to give a reward, such as money or other credits, to users who report damage. In addition to reporting damage, users could also report possible future problems. The power line companies could give incentives for event reports depending on the possible cost savings resulting from the event report.
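
A minimal sketch of the QR-based report described above, under assumed field names and message format: the report couples the object identification read from the code with the phone's GPS fix at scan time, plus an optional incident type and photo reference.

    import json
    import time

    def qr_incident_report(object_id, gps, incident_type, photo=None):
        # object_id: unique identification read from the pole's QR code
        # gps: (lat, lon) of the phone at the time of scanning
        return json.dumps({
            "object_id": object_id,
            "scanned_at": time.time(),
            "reporter_gps": gps,
            "incident": incident_type,  # e.g. selected from a menu
            "photo": photo})            # optional image taken at the place

    print(qr_incident_report("PL200/pole-205", (60.1699, 24.9384),
                             "fallen power line", "IMG_0042.jpg"))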

[0072] According to another embodiment, the system 100 is configured to form an overview of the situation at the site 210 of the network 200. For example, the control unit 130 might be configured to receive data from a plurality of sources, such as a satellite, a drone, helicopters, a LiDAR system, multiple users with mobile terminals, etc., to form an overview of the situation using multiple data sources. For example, in some areas the data might include only one picture taken with a mobile phone, while in other areas there might be video coverage, satellite images and photos.

[0073] It is intended that all matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative only and not limiting of the scope of the disclosure. Expressions such as "including", "comprising", "incorporating", "consisting of", "have", "is" used to describe the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural.

[0074] While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad present disclosure, and that this present disclosure is not limited to the specific constructions and arrangements shown and described, since various other modifications and/or adaptations may occur to those of ordinary skill in the art. It is to be understood that individual features shown or described for one embodiment may be combined with individual features shown or described for another embodiment.

* * * * *

