{This is an HTML version of my Response to the First Office Action. It does not have line numbers or page numbers, and it does not include the Appendix. The PDF version (including the Appendix) is available at my blog for this issue.  JM }

 

IN THE UNITED STATES PATENT AND TRADEMARK OFFICE

 

 

In re Application of Jed Margolin    

Serial No.: 11/736,356                                                                       Examiner: Ronnie M. Mancho

Filed: 04/17/2007                                                                               Art Unit: 3664

For: SYSTEM AND METHOD FOR SAFELY FLYING UNMANNED AERIAL VEHICLES IN CIVILIAN AIRSPACE

 

Mail Stop Amendment

Commissioner for Patents

P.O. Box 1450

Alexandria, VA 22313-1450

 

RESPONSE

 

Dear Sir:

 

            In response to the Office Action mailed September 1, 2010, please consider the following remarks.

 

Section 1 - General Summary

Claims 1 - 14 were rejected solely under 35 U.S.C. §103(a) as being obvious over the combination of U.S. Patent 5,904,724 (“Margolin ‘724”) and published Patent Application US 2005/0004723 (“Duggan”). Applicant will show that the Examiner has failed to meet his burden of establishing a prima facie case of obviousness.

a.  The Examiner has failed to distinctly point out where all of the claim elements and limitations of Applicant’s claims are present in the two cited references.

b.  The Examiner has mischaracterized the two cited references as teaching all of the claim elements and limitations of Applicant’s claims, when they do not.

c.   The present Applicant is the named inventor on one of the Examiner’s cited references (U.S. Patent 5,904,724).

 

Section 2 - Detailed Response

 

Part A - Examiner’s Detailed Action Paragraph 2

 

2.         Claims 1-14 are rejected under 35 U.S.C. 103(a) as being unpatentable over Margolin (5904724) in view of Duggan et al (US 2005004723).

 

 

Regarding claim 1, Margolin (abstract; figs. 1-7; col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-67) discloses a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

 

(a)  a ground station 400 (fig. 1 & 4) equipped with a synthetic vision system (figs. 1 &3; col. 4, lines 1 to col. 5, lines 67);

 

(b)  an unmanned aerial vehicle 300 (figs. 1 &3) capable of supporting said synthetic vision system (305, 306, 307, 311 on aircraft; col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-­67);

 

(c)  a remote pilot 102 operating said ground station 400 (figs. 1&4; col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-67);

 

(d)  a communications link between said unmanned aerial vehicle 300 and said ground station 400;

 

(e)  a system onboard said unmanned aerial vehicle 300 for detecting the presence and position of nearby aircraft (305, 306, 307, 311 on aircraft) and communicating this information to said remote pilot 102 (col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-67);

 

whereas said remote pilot uses said synthetic vision system (305, 306, 307, 311 on aircraft) to control said unmanned aerial vehicle 300 during at least selected phases of the flight of said unmanned aerial vehicle.

 

 

Applicant Responds.

MPEP § 2142 states under the heading ESTABLISHING A PRIMA FACIE CASE OF OBVIOUSNESS

 

a.   **>The key to supporting any rejection under 35 U.S.C. 103 is the clear articulation of the reason(s) why the claimed invention would have been obvious. The Supreme Court in KSR International Co. v. Teleflex Inc., 550 U.S. ___, ___, 82 USPQ2d 1385, 1396 (2007) noted that the analysis supporting a rejection under 35 U.S.C. 103 should be made explicit. The Federal Circuit has stated that "rejections on obviousness cannot be sustained with mere conclusory statements; instead, there must be some articulated reasoning with some rational underpinning to support the legal conclusion of obviousness." In re Kahn, 441 F.3d 977, 988, 78 USPQ2d 1329, 1336 (Fed. Cir. 2006). See also KSR, 550 U.S. at ___ , 82 USPQ2d at 1396 (quoting Federal Circuit statement with approval). <

 

 

{Emphasis added}

 

The Examiner has cited lengthy passages in the above rejection and made conclusory statements as to their contents.

 

Examiner:

Regarding claim 1, Margolin (abstract; figs. 1-7; col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-67) discloses a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

 

Applicant:

In Margolin ‘724: Column 3, lines 8-67; Column 4, lines 1-67; and Column 5, lines 1-67 form a continuous passage from Column 3, line 8 to Column 5, line 67. This passage of approximately 1619 words forms the core of the Margolin ‘724 DETAILED DESCRIPTION. The remainder of the Margolin ‘724 DETAILED DESCRIPTION teaches additional topics such as Flight Control (with headings Flight Control, Direct Control Non-Remotely Piloted Vehicles, Computer Mediated Non-Remotely Piloted Vehicles, Second Order Flight Control Mode, and First Order Flight Control Mode) {See Column 6, line 19 - Column 8, line 3}, the features of a Control Panel {See Column 8, line 64 - Column 9, line 18}, the use of a Head-Mounted Display {See Column 9, lines 19 - 32}, the use of the invention for training {See Column 9, lines 33 - 63}, and The Database {See Column 9, line 64 - Column 10, line 50}.

 

The Examiner cites Figures 1 - 7 in Margolin ‘724. These constitute all the figures in Margolin ‘724.

 

The Examiner also cites the Abstract in Margolin ‘724. According to 608.01(b) Abstract of the Disclosure [R-7]:

37 CFR 1.72 Title and abstract.

*****

(b) A brief abstract of the technical disclosure in the specification must commence on a separate sheet, preferably following the claims, under the heading "Abstract" or "Abstract of the Disclosure." The sheet or sheets presenting the abstract may not include other parts of the application or other material. The abstract in an application filed under 35 U.S.C. 111 may not exceed 150 words in length. The purpose of the abstract is to enable the United States Patent and Trademark Office and the public generally to determine quickly from a cursory inspection the nature and gist of the technical disclosure.<

 

{Emphasis added}

 

The popular interpretation of 608.01(b) is that the purpose of the Abstract is to provide search terms. In any event, the Abstract in Margolin ‘724 does not say anything about civilian airspace.

 

The Examiner has made a conclusory statement by repeating the title of Applicant’s invention (leaving out the words “and method”) and citing the core of the DETAILED DESCRIPTION in Margolin ‘724.

 

In the remaining sections of the Examiner’s rejection of Applicant’s Claim 1 he asserts that he has found all of the elements and limitations of Applicant’s invention.

 

It is not surprising that some of the elements of Applicant’s invention are present in Margolin ‘724, since Margolin ‘724 is probably the pioneering patent for the use of what is now called synthetic vision in remotely piloted aircraft (now commonly called Unmanned Aerial Vehicles), and Applicant’s present invention uses synthetic vision as an element.

 

However, there are limitations in Applicant’s current invention that are not present in Margolin ‘724.

 

Examiner:

 

whereas said remote pilot uses said synthetic vision system (305, 306, 307, 311 on aircraft) to control said unmanned aerial vehicle 300 during at least selected phases of the flight of said unmanned aerial vehicle.

 

{From Applicant’s Claim 1}

 

References 305, 306, 307, 311, and 300 come from Margolin ‘724 Figure 3 which shows the structural elements in Margolin ‘724 Remote Aircraft Unit 300. There is nothing in these structural elements which show that synthetic vision is used “during at least selected phases of the flight of said unmanned aerial vehicle.”

 

The Examiner has not shown that this limitation is taught in Margolin ‘724. He has only made a conclusory statement.

 

Although KSR may have loosened the reasoning that may be employed for combining prior art references in an obviousness rejection, the Examiner must still provide a factual basis for each of the claimed features of a rejected claim. MPEP 2143.03, entitled “All Claim Limitations Must Be Considered,” states: “All words in a claim must be considered in judging the patentability of that claim against the prior art.” In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970).

If an examiner fails to address all of the recitations of a rejected claim, a prima facie case of obviousness has not been established because such a deficiency fails to satisfy the evidentiary requirements articulated by the Supreme Court in KSR (e.g. “the key to supporting any rejection under 35 U.S.C. 103 is the clear articulation of the reason(s) why the claimed invention would have been obvious” and that “a rejection under 35 U.S.C. 103 should be made explicit.”)

The BPAI in a recent decision (Ex parte Wehling et al.) stated (with emphases added): “the dispositive issue in this case is whether the Examiner has explicitly articulated a prima facie case of obviousness which addresses all of the limitations of the claimed invention.” The BPAI was guided by the following legal principles:

“When determining whether a claim is obvious, an Examiner must make ‘a searching comparison of the claimed invention – including all its limitations – with the teachings of the prior art.’ In re Ochiai, 71 F.3d 1565, 1572 (Fed. Cir. 1995) (emphasis added). Thus, ‘obviousness requires a suggestion of all limitations in a claim.’ CFMT, Inc. v. Yieldup Int’l. Corp., 349 F.3d 1333, 1342 (Fed. Cir. 2003) (citing In re Royka, 490 F.2d 981, 985 (CCPA 1974)). Furthermore, in KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 418 (2007) (citing In re Kahn, 441 F.3d 977, 988 (Fed. Cir. 2006), the Supreme Court noted that ‘[t]o facilitate review, this [obviousness] analysis should be made explicit.’” (Ex parte Wehling et al., Appeal No. 2009-8111 (BPAI))

The BPAI in Ex Parte Wehling et al. held that “absent a fact-based analysis which explicitly compares all the limitations of the claimed invention with the combined teachings of Gioffre and Rockliffe, we are constrained to reverse the rejection of claims 1, 21, 29, and 31 and the claims dependent thereon under § 103 over the combined teachings of Gioffre and Rockliffe.”

Note that Ex parte Wehling et al. (Appeal 2009-008111, Application 10/743,118) was decided May 17, 2010. According to the BPAI online database the decision was issued 10/19/2010, which is after the mail date of the Examiner’s rejection (9/1/2010).

 

Examiner’s Detailed Action Paragraph 2 (Continued)

 

The Examiner continues

 

Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to modify Margolin as taught by Duggan for the purpose of incorporating an autopilot to ensure smooth transitions (Duggna abstract, sec 0014, 0085, 0086).

The different embodiments in both prior arts are combinable as it would be obvious to ne[sic] having ordinary skill in the art.

 

(Applicant assumes Examiner meant to say, “The different embodiments in both prior arts are combinable as it would be obvious to one having ordinary skill in the art.”)

 

The Examiner has mischaracterized Duggan.

 

Examiner:

Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to modify Margolin as taught by Duggan for the purpose of incorporating an autopilot to ensure smooth transitions (Duggna abstract, sec 0014, 0085, 0086).

The different embodiments in both prior arts are combinable as it would be obvious to ne [sic] having ordinary skill in the art.

Duggan (the sections cited by the Examiner, reproduced in the order cited):

 

 

[0352]   In one aspect of the present invention, an operator station (also referred to as the ground control station or GCS) is designed to accommodate command and control of multiple vehicles or a single vehicle by a single operator. In accordance with one embodiment, the ground control station is platform independent and implements an application program interface that provides windowing and communications interfaces (e.g., the platform is implemented in Open Source wxWindows API). The underlying operating system is illustratively masked and enables a developer to code in a high level environment.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0318] If the pilot chooses a surveillance location outside the total FOV, then the outer loop guidance will illustratively follow a command-to-LOS mode guide law until the UAV flight path points toward the target. Once the desired staring-point comes within a minimum range threshold, the guidance automatically trips into a loiter pattern (either constant-radius or elliptical) to maintain a station with a single key-click while he/she conducts other activities. FIGS. 22A & 22B together demonstrate the surveillance-point approach scenario.

 

[0322] In accordance with one aspect of the present invention, sensor-slave mode commands are generated by an autonomous line-of-sight driven function, in which the command objectives are generated by the necessities of the function rather than by an operator. For example, a function designed to command a raster-scan of a particular surveillance area, or a function designed to scan a long a roadway could be used to generate sensor slave commands. Another example is a function designed to generate line-of-sight commands for UAV-to-UAV rendezvous formation flying.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0356] a synthetic vision display

 

[0365] The two video monitors are illustratively used to display real-time data linked camera imagery from two air vehicles having cameras (of course, fewer, more or none of the vehicles might have cameras and the number of monitor displays can be altered accordingly). In accordance with one embodiment, camera imagery is recorded on videotapes during a mission. In accordance with one embodiment, the two repeater displays are used to provide redundant views of the GUI and synthetic vision display. The laptop illustratively serves as a GUI backup in the event that the main GUI fails.

 

[0388] In one aspect of the present invention, synthetic vision display technical approach of the present invention is based upon integrating advanced simulated visuals, originally developed for training purposes, into UAV operational systems. In accordance with one embodiment, the simulated visuals are integrated with data derived from the ground control station during flight to enable real-time synthetic visuals.

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0346] In accordance with one embodiment, an exemplary translation layer implementation will now be provided. After the guidance algorithms execute, the outputs are translated to the native vehicle autopilot commands. The equations below provide example kinematic translations from the guidance acceleration commands to native vehicle autopilot commands. These equations demonstrate the principal that vehicle motion is activated through acceleration. The methods that various vehicles employ to generate acceleration are numerous (bank angle autopilot, acceleration autopilot, heading control autopilot, altitude control autopilot, etc). Since the control algorithms described herein generate acceleration commands that can be kinematically translated into any of these native autopilot commands, the guidance algorithms truly provide a generalized library of control laws that can control any vehicle through that vehicle's native atomic functions. Ubiquitous acceleration control techniques enable VACS to synthesize control commands for any vehicle, including air, ground, or sea-based.

{Eq. 57, reformatted; the mathematical symbols did not survive the HTML conversion:}

a_v = vertical plane acceleration command
a_h = horizontal plane acceleration command
φ = tan⁻¹(a_h / a_v) = bank angle command
a_T = √(a_v² + a_h²) = total body acceleration command
ψ̇ = a_h / V = turn rate command
ψ_i = ψ_(i−1) + ψ̇ · Δt = heading command
γ̇ = (a_v − g) / V = flight path rate command
γ_i = γ_(i−1) + γ̇ · Δt = flight path angle command
ḣ = V · sin(γ) = climb rate command
h_i = h_(i−1) + ḣ · Δt = altitude command          (Eq. 57)

 

[0347] Additional functionality that can be enabled in a translation layer is means for discouraging or preventing an operator (e.g., the human or non-human operator interfacing the VACS architecture) from overdriving, stalling, or spinning the vehicle frame. This being said, limiting algorithms can also be employed in the guidance or autopilot functions.

[0348] X. Autopilot

[0349] As has been addressed, the present invention is not limited to, and does not require, a particular autopilot system. The control system and architecture embodiments of the present invention can be adapted to accommodate virtually any autopilot system.

[0350] For the purpose of providing an example, an illustrative suitable autopilot software system will now be described. The illustrative autopilot system incorporates a three-axis design (pitch and yaw with an attitude control loop in the roll axis) for vehicle stabilization and guidance command tracking. The autopilot software design incorporates flight control techniques, which allow vehicle control algorithms to dynamically adjust airframe stabilization parameters in real-time during flight. The flight computer is programmed directly with the airframe physical properties, so that it can automatically adjust its settings with changes in airframe configuration, aerodynamic properties, and/or flight state. This provides for a simple and versatile design, and possesses the critical flexibility needed when adjustments to the airframe configuration become necessary. The three-loop design includes angular rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, and acceleration feedback for command tracking. In addition, an integral controller in the forward loop illustratively provides enhanced command tracking, low frequency disturbance rejection and an automatic trim capability.

 

{The Examiner may have meant 0390-0392. Otherwise the range is not credible}

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0391] The described Intelligent displays with smart variables represent an effective approach to actively displaying information for different types of vehicles. However, a problem can arise when a new vehicle is integrated into the ground control station with a completely foreign command and control interface. Under these circumstances, the ground control station is not concerned about displaying data, but is tasked to provide a command and control interface for the operator to perform the required operations. This conundrum is the motivation for another embodiment of the present invention, namely, the integration of vehicle specific panels in the ground control station.

 

[0392] In one embodiment, a generic vehicle class (GVC) is illustratively a software component that provides a rapid development environment API to add new vehicle classes and types to the ground control station. The GVC also illustratively serves as a software construct that allows the inclusion of multiple vehicles within the ground control station framework. One of the variables in the application is a vector of pointers to a generic vehicle class. This list is constructed by allocating new specific vehicles and returning a type case to the base generic vehicle class. When a new vehicle is integrated into the ground control station, the generic vehicle class provides all of the virtual functions to integrate with system control components (e.g., to integrate with a map display, a communications package, PCIG imagery and/or appropriate display windows). An important object in the application framework is illustratively a pointer to the current vehicle generic class. When the user switches vehicles, this pointer is updated and all displays grab the appropriate smart variables from the pointer to the new base class. This is the mechanism by which windows immediately update to the current vehicle information whenever the user switches vehicles. The default windows use the pointer to the current vehicle to grab information. In this manner, if the user switches to a new vehicle with a different set of datalink variables, that fact is immediately apparent on the display windows.

 

 

 

 

 

 

 

Abstract

Embodiments are disclosed for a vehicle control system and related sub-components that together provide an operator with a plurality of specific modes of operation, wherein various modes of operation incorporate different levels of autonomous control. Through a control user interface, an operator can move between certain modes of control even after vehicle deployment. Specialized autopilot system components and methods are employed to ensure smooth transitions between control modes. Empowered by the multi-modal control system, an operator can even manage multiple vehicles simultaneously.

 

[0014] Embodiments of the present invention pertain to a hierarchical control system, user interface system, and control architecture that together incorporate a broad range of user-selectable control modes representing variable levels of autonomy and vehicle control functionality. A unified autopilot is provided to process available modes and mode transitions. An intelligence synthesizer is illustratively provided to assist in resolving functional conflicts and transitioning between control modes, although certain resolutions and transitions can be incorporated directly into the functional sub-components associated with the different control modes. In accordance with one embodiment, all modes and transitions are funneled through an acceleration-based autopilot system. Accordingly, control commands and transitions are generally reduced to an acceleration vector to be processed by a centralized autopilot system.

 

[0085] As will be discussed in greater detail below, the control system and architecture embodiments of the present invention essentially enable any autopilot design to support control of a vehicle in numerous control modes that are executed with switches between modes during flight. All control modes are supported even in the presence of sensor errors, such as accelerometer and gyro biases. This robustness is at least partially attributable to the fact that the closed-loop system, in all control modes, is essentially slaved to an inertial path and, hence, the sensor biases wash out in the closed loop, assuming the biases are not so grossly large that they induce stability problems in the autopilot system. Furthermore, winds are generally not an issue in the overall control scheme in that the flight control system will regulate to the inertial path, adjusting for winds automatically in the closed loop. Given the precision afforded by inertial navigation aided by GPS technology, inertial path regulation offers a highly effective and robust UAV control approach. Generally speaking, the autopilot system functions such that winds, medium Dryden turbulence levels, sensor errors, airframe aerodynamic and mass model parameter uncertainties, servo non-linearity (slew rate limits, etc.), and various other atmospheric and noise disturbances will not have a critically negative impact on flight path regulation.

 

[0086] Component 408 receives commands generated by component 404 and filtered by autopilot component 406. The commands received by component 408 are executed to actually manipulate the vehicle's control surfaces. Autopilot component 406 then continues to monitor vehicle stabilization and/or command tracking, making additional commands to component 408 as necessary.

 

At the beginning of this subsection, the Examiner asserts, “Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising: …”

 

The Examiner’s statement, “However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising: …” is conclusory and is not supported by the Examiner’s citations to Duggan.

 

In addition, none of the Duggan citations teach that either synthetic vision or Duggan’s Variable Autonomy System is used “during at least selected phases of the flight of said unmanned aerial vehicle” which is a limitation in Applicant’s Claim 1.

 

Duggan fails to teach the limitation that his Variable Autonomy System is used during selected phases of a UAV’s flight and Margolin ‘724 fails to teach the limitation that synthetic vision is used during selected phases of a UAV’s flight. Therefore, the combination of Duggan and Margolin ‘724 does not read on Applicant’s Claim 1.

 

As cited above by Applicant, MPEP 2143.03, “All Claim Limitations Must Be Considered,” states: “All words in a claim must be considered in judging the patentability of that claim against the prior art.” In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970).

 

The Examiner has failed in his duty under MPEP 2143.03 (and in view of Wehling) to present a prima facie case of obviousness for rejecting Applicant’s Claim 1.

 

Regarding the Examiner’s rejection of Claim 2: Claim 2 depends on Claim 1, and Applicant has shown that Claim 1 is nonobvious. Therefore, under MPEP 2143.03 (All Claim Limitations Must Be Considered) and In re Fine, Claim 2 is nonobvious.

2143.03 All Claim Limitations Must Be **>Considered< [R-6]

** "All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Regarding the Examiner’s rejection of Claim 3: Claim 3 depends on Claim 1, and Applicant has shown that Claim 1 is nonobvious. Therefore, under MPEP 2143.03 (All Claim Limitations Must Be Considered) and In re Fine, Claim 3 is nonobvious.

2143.03 All Claim Limitations Must Be **>Considered< [R-6]

** "All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Regarding the Examiner’s rejection of Claim 4: Claim 4 depends on Claim 1, and Applicant has shown that Claim 1 is nonobvious. Therefore, under MPEP 2143.03 (All Claim Limitations Must Be Considered) and In re Fine, Claim 4 is nonobvious.

2143.03 All Claim Limitations Must Be **>Considered< [R-6]

** "All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Examiner:

 

Regarding claim 5, Margolin (abstract; figs. 1-7; col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-67) in view of Duggan disclose a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

(a)   a ground station equipped with a synthetic vision system;

(b)   an unmanned aerial vehicle capable of supporting said synthetic vision system;

(c)   a remote pilot operating said ground station;

(d)   a communications link between said unmanned aerial vehicle and said ground station;

 e)   a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot;

whereas said remote pilot uses said synthetic vision system to control said unmanned aerial vehicle during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system, and

           whereas the selected phases of the flight of said unmanned aerial vehicle comprise:

     (a)   when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

     (b)   when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

Applicant:

In Margolin ‘724, Column 3, lines 8-67; Column 4, lines 1-67; and Column 5, lines 1-67 form a continuous passage from Column 3, line 8 to Column 5, line 67. This passage of approximately 1,619 words forms the core of the Margolin ‘724 DETAILED DESCRIPTION. The remainder of the Margolin ‘724 DETAILED DESCRIPTION teaches additional topics: Flight Control (with the headings Flight Control, Direct Control Non-Remotely Piloted Vehicles, Computer Mediated Non-Remotely Piloted Vehicles, Second Order Flight Control Mode, and First Order Flight Control Mode) {See Column 6, line 19 - Column 8, line 3}; the features of a Control Panel {See Column 8, line 64 - Column 9, line 18}; the use of a Head-Mounted Display {See Column 9, lines 19-32}; the use of the invention for training {See Column 9, lines 33-63}; and The Database {See Column 9, line 64 - Column 10, line 50}.

 

The Examiner cites Figures 1 - 7 in Margolin ‘724. These constitute all the figures in Margolin ‘724.

 

The Examiner also cites the Abstract in Margolin ‘724. According to MPEP § 608.01(b), Abstract of the Disclosure [R-7]:

37 CFR 1.72 Title and abstract.

*****

(b) A brief abstract of the technical disclosure in the specification must commence on a separate sheet, preferably following the claims, under the heading "Abstract" or "Abstract of the Disclosure." The sheet or sheets presenting the abstract may not include other parts of the application or other material. The abstract in an application filed under 35 U.S.C. 111 may not exceed 150 words in length. The purpose of the abstract is to enable the United States Patent and Trademark Office and the public generally to determine quickly from a cursory inspection the nature and gist of the technical disclosure.<

 

{Emphasis added}

 

The popular interpretation of 608.01(b) is that the purpose of the Abstract is to provide search terms. In any event, the Abstract in Margolin ‘724 does not say anything about civilian airspace.

 

The Examiner has made a conclusory statement by repeating the title of Applicant’s invention (leaving out the words “and method”) and citing the core of the DETAILED DESCRIPTION in Margolin ‘724.

 

In the remaining sections of the Examiner’s rejection of Applicant’s Claim 5 he asserts that he has found all of the elements and limitations of Applicant’s invention.

 

It is not surprising that some of the elements of Applicant’s invention are present in Margolin ‘724, since Margolin ‘724 is probably the pioneering patent for the use of what is now called synthetic vision in remotely piloted aircraft (now commonly called Unmanned Aerial Vehicles), and Applicant’s present invention uses synthetic vision as an element.

 

However, there are limitations in Applicant’s current invention that are not present in Margolin ‘724.

 

Examiner:

whereas said remote pilot uses said synthetic vision system to control said unmanned aerial vehicle during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system, and

 

whereas the selected phases of the flight of said unmanned aerial vehicle comprise:

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

The Examiner has not even attempted to show where these limitations are taught in Margolin ‘724. As noted, he has cited the core of the Margolin ‘724 DETAILED DESCRIPTION, all of the drawings, and the Abstract. His rejection is purely conclusory and does not meet the requirements for a prima facie rejection set out in MPEP § 2143.03 (All Claim Limitations Must Be Considered), KSR, and Wehling, as well as MPEP § 2142 (ESTABLISHING A PRIMA FACIE CASE OF OBVIOUSNESS).

 

The Examiner continues:

Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to modify Margolin as taught by Duggan for the purpose of incorporating an autopilot to ensure smooth transitions (Duggna abstract, sec 0014, 0085, 0086).

The different embodiments in both prior arts are combinable as it would be obvious to ne having ordinary skill in the art.

Examiner’s citations, matched to the corresponding Duggan paragraphs (reproduced in full below, in the order cited):

 

Examiner: “Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising: a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353)”: see Duggan [0352] and [0353].

 

Examiner: “wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353)”: see Duggan [0318], [0322], and [0353].

 

Examiner: “when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329)”: see Duggan [0356], [0365], [0388], [0390], and [0346]-[0350].

 

Examiner: “Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to modify Margolin as taught by Duggan for the purpose of incorporating an autopilot to ensure smooth transitions (Duggna abstract, sec 0014, 0085, 0086)”: see Duggan’s Abstract and paragraphs [0014], [0085], and [0086].


[0352]   In one aspect of the present invention, an operator station (also referred to as the ground control station or GCS) is designed to accommodate command and control of multiple vehicles or a single vehicle by a single operator. In accordance with one embodiment, the ground control station is platform independent and implements an application program interface that provides windowing and communications interfaces (e.g., the platform is implemented in Open Source wxWindows API). The underlying operating system is illustratively masked and enables a developer to code in a high level environment.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0318] If the pilot chooses a surveillance location outside the total FOV, then the outer loop guidance will illustratively follow a command-to-LOS mode guide law until the UAV flight path points toward the target. Once the desired staring-point comes within a minimum range threshold, the guidance automatically trips into a loiter pattern (either constant-radius or elliptical) to maintain a station with a single key-click while he/she conducts other activities. FIGS. 22A & 22B together demonstrate the surveillance-point approach scenario.

 

[0322] In accordance with one aspect of the present invention, sensor-slave mode commands are generated by an autonomous line-of-sight driven function, in which the command objectives are generated by the necessities of the function rather than by an operator. For example, a function designed to command a raster-scan of a particular surveillance area, or a function designed to scan a long a roadway could be used to generate sensor slave commands. Another example is a function designed to generate line-of-sight commands for UAV-to-UAV rendezvous formation flying.

 


 

[0356] a synthetic vision display

 

[0365] The two video monitors are illustratively used to display real-time data linked camera imagery from two air vehicles having cameras (of course, fewer, more or none of the vehicles might have cameras and the number of monitor displays can be altered accordingly). In accordance with one embodiment, camera imagery is recorded on videotapes during a mission. In accordance with one embodiment, the two repeater displays are used to provide redundant views of the GUI and synthetic vision display. The laptop illustratively serves as a GUI backup in the event that the main GUI fails.

 

[0388] In one aspect of the present invention, synthetic vision display technical approach of the present invention is based upon integrating advanced simulated visuals, originally developed for training purposes, into UAV operational systems. In accordance with one embodiment, the simulated visuals are integrated with data derived from the ground control station during flight to enable real-time synthetic visuals.

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0346] In accordance with one embodiment, an exemplary translation layer implementation will now be provided. After the guidance algorithms execute, the outputs are translated to the native vehicle autopilot commands. The equations below provide example kinematic translations from the guidance acceleration commands to native vehicle autopilot commands. These equations demonstrate the principal that vehicle motion is activated through acceleration. The methods that various vehicles employ to generate acceleration are numerous (bank angle autopilot, acceleration autopilot, heading control autopilot, altitude control autopilot, etc). Since the control algorithms described herein generate acceleration commands that can be kinematically translated into any of these native autopilot commands, the guidance algorithms truly provide a generalized library of control laws that can control any vehicle through that vehicle's native atomic functions. Ubiquitous acceleration control techniques enable VACS to synthesize control commands for any vehicle, including air, ground, or sea-based.

     a_v = vertical plane acceleration command
     a_h = horizontal plane acceleration command
     φ = tan⁻¹(a_h / a_v) = bank angle command
     a_T = √(a_v² + a_h²) = total body acceleration command
     ψ̇ = a_h / V = turn rate command
     ψ_i = ψ_(i−1) + ψ̇ Δt = heading command
     γ̇ = (a_v − g) / V = flight path rate command
     γ_i = γ_(i−1) + γ̇ Δt = flight path angle command
     ḣ = V sin(γ) = climb rate command
     h_i = h_(i−1) + ḣ Δt = altitude command          (Eq. 57)
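{For illustration only, and not part of Duggan’s disclosure or of the record: the kinematic translations of Duggan’s Eq. 57 quoted above can be sketched as code. The function name, its interface, and the discrete time step dt are assumptions introduced for this sketch.}

```python
import math

def translate_accel_commands(a_v, a_h, V, psi_prev, gamma_prev, h_prev, dt, g=9.81):
    """Kinematic translation of guidance acceleration commands (vertical a_v,
    horizontal a_h) into native autopilot commands, following Duggan's Eq. 57.
    V is vehicle speed; dt is an assumed discretization interval."""
    bank = math.atan2(a_h, a_v)           # bank angle command, tan^-1(a_h / a_v)
    a_T = math.hypot(a_v, a_h)            # total body acceleration command
    psi_dot = a_h / V                     # turn rate command
    psi = psi_prev + psi_dot * dt         # heading command
    gamma_dot = (a_v - g) / V             # flight path rate command
    gamma = gamma_prev + gamma_dot * dt   # flight path angle command
    h_dot = V * math.sin(gamma)           # climb rate command
    h = h_prev + h_dot * dt               # altitude command
    return {"bank": bank, "a_T": a_T, "psi": psi,
            "gamma": gamma, "h_dot": h_dot, "h": h}
```

{As the quoted paragraph notes, any autopilot that accepts one of these command forms (bank angle, heading, altitude, etc.) can thereby be driven from a single acceleration-command interface.}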

 

[0347] Additional functionality that can be enabled in a translation layer is means for discouraging or preventing an operator (e.g., the human or non-human operator interfacing the VACS architecture) from overdriving, stalling, or spinning the vehicle frame. This being said, limiting algorithms can also be employed in the guidance or autopilot functions.

[0348] X. Autopilot

[0349] As has been addressed, the present invention is not limited to, and does not require, a particular autopilot system. The control system and architecture embodiments of the present invention can be adapted to accommodate virtually any autopilot system.

[0350] For the purpose of providing an example, an illustrative suitable autopilot software system will now be described. The illustrative autopilot system incorporates a three-axis design (pitch and yaw with an attitude control loop in the roll axis) for vehicle stabilization and guidance command tracking. The autopilot software design incorporates flight control techniques, which allow vehicle control algorithms to dynamically adjust airframe stabilization parameters in real-time during flight. The flight computer is programmed directly with the airframe physical properties, so that it can automatically adjust its settings with changes in airframe configuration, aerodynamic properties, and/or flight state. This provides for a simple and versatile design, and possesses the critical flexibility needed when adjustments to the airframe configuration become necessary. The three-loop design includes angular rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, and acceleration feedback for command tracking. In addition, an integral controller in the forward loop illustratively provides enhanced command tracking, low frequency disturbance rejection and an automatic trim capability.
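{For illustration only, and not part of Duggan’s disclosure or of the record: the three-loop structure described in [0350] (angular-rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, acceleration feedback for command tracking, plus a forward-loop integrator for trim and low-frequency disturbance rejection) is a conventional control pattern. The sketch below is a generic instance of that pattern; all gains, signal names, and the update interval are assumptions.}

```python
class ThreeLoopAutopilot:
    """Generic sketch of a three-loop autopilot axis: rate and attitude
    feedback stabilize the airframe while acceleration-error feedback
    (and its forward-loop integral) tracks the guidance command."""

    def __init__(self, k_rate=0.5, k_att=1.0, k_acc=0.8, k_int=0.2, dt=0.02):
        self.k_rate, self.k_att, self.k_acc = k_rate, k_att, k_acc
        self.k_int, self.dt = k_int, dt
        self.integrator = 0.0  # forward-loop integral (automatic trim)

    def step(self, accel_cmd, accel_meas, attitude, body_rate):
        # Acceleration tracking error and its integral (disturbance rejection).
        error = accel_cmd - accel_meas
        self.integrator += error * self.dt
        # Sum of the three feedback loops plus the integral path.
        return (self.k_acc * error
                + self.k_int * self.integrator
                - self.k_att * attitude
                - self.k_rate * body_rate)
```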

 

{The Examiner may have meant 0390-0392. Otherwise the range is not credible}

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0391] The described Intelligent displays with smart variables represent an effective approach to actively displaying information for different types of vehicles. However, a problem can arise when a new vehicle is integrated into the ground control station with a completely foreign command and control interface. Under these circumstances, the ground control station is not concerned about displaying data, but is tasked to provide a command and control interface for the operator to perform the required operations. This conundrum is the motivation for another embodiment of the present invention, namely, the integration of vehicle specific panels in the ground control station.

 

[0392] In one embodiment, a generic vehicle class (GVC) is illustratively a software component that provides a rapid development environment API to add new vehicle classes and types to the ground control station. The GVC also illustratively serves as a software construct that allows the inclusion of multiple vehicles within the ground control station framework. One of the variables in the application is a vector of pointers to a generic vehicle class. This list is constructed by allocating new specific vehicles and returning a type case to the base generic vehicle class. When a new vehicle is integrated into the ground control station, the generic vehicle class provides all of the virtual functions to integrate with system control components (e.g., to integrate with a map display, a communications package, PCIG imagery and/or appropriate display windows). An important object in the application framework is illustratively a pointer to the current vehicle generic class. When the user switches vehicles, this pointer is updated and all displays grab the appropriate smart variables from the pointer to the new base class. This is the mechanism by which windows immediately update to the current vehicle information whenever the user switches vehicles. The default windows use the pointer to the current vehicle to grab information. In this manner, if the user switches to a new vehicle with a different set of datalink variables, that fact is immediately apparent on the display windows.
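{For illustration only, and not part of Duggan’s disclosure or of the record: the generic vehicle class mechanism described in [0392] (a list of pointers to a base vehicle class, with displays reading through a single current-vehicle pointer) is a standard polymorphism pattern. The class and method names below are hypothetical, not Duggan’s actual API.}

```python
class GenericVehicle:
    """Base class standing in for Duggan's generic vehicle class (GVC):
    each vehicle type overrides the virtual interface."""
    def datalink_variables(self):
        raise NotImplementedError

class FixedWingUAV(GenericVehicle):
    def datalink_variables(self):
        return {"airspeed_kts": 80, "altitude_ft": 1200}

class Rotorcraft(GenericVehicle):
    def datalink_variables(self):
        return {"rotor_rpm": 350, "altitude_ft": 300}

class GroundControlStation:
    def __init__(self):
        self.vehicles = []   # the "vector of pointers" to the generic class
        self.current = None  # pointer to the currently selected vehicle

    def add_vehicle(self, vehicle):
        self.vehicles.append(vehicle)
        if self.current is None:
            self.current = vehicle

    def switch_vehicle(self, index):
        # Updating this single pointer is why every display window
        # immediately reflects the newly selected vehicle's variables.
        self.current = self.vehicles[index]

    def display(self):
        return self.current.datalink_variables()
```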

 

 

 

 

 

 

 

Abstract

 

Embodiments are disclosed for a vehicle control system and related sub-components that together provide an operator with a plurality of specific modes of operation, wherein various modes of operation incorporate different levels of autonomous control. Through a control user interface, an operator can move between certain modes of control even after vehicle deployment. Specialized autopilot system components and methods are employed to ensure smooth transitions between control modes. Empowered by the multi-modal control system, an operator can even manage multiple vehicles simultaneously.

 

[0014] Embodiments of the present invention pertain to a hierarchical control system, user interface system, and control architecture that together incorporate a broad range of user-selectable control modes representing variable levels of autonomy and vehicle control functionality. A unified autopilot is provided to process available modes and mode transitions. An intelligence synthesizer is illustratively provided to assist in resolving functional conflicts and transitioning between control modes, although certain resolutions and transitions can be incorporated directly into the functional sub-components associated with the different control modes. In accordance with one embodiment, all modes and transitions are funneled through an acceleration-based autopilot system. Accordingly, control commands and transitions are generally reduced to an acceleration vector to be processed by a centralized autopilot system.

 

[0085] As will be discussed in greater detail below, the control system and architecture embodiments of the present invention essentially enable any autopilot design to support control of a vehicle in numerous control modes that are executed with switches between modes during flight. All control modes are supported even in the presence of sensor errors, such as accelerometer and gyro biases. This robustness is at least partially attributable to the fact that the closed-loop system, in all control modes, is essentially slaved to an inertial path and, hence, the sensor biases wash out in the closed loop, assuming the biases are not so grossly large that they induce stability problems in the autopilot system. Furthermore, winds are generally not an issue in the overall control scheme in that the flight control system will regulate to the inertial path, adjusting for winds automatically in the closed loop. Given the precision afforded by inertial navigation aided by GPS technology, inertial path regulation offers a highly effective and robust UAV control approach. Generally speaking, the autopilot system functions such that winds, medium Dryden turbulence levels, sensor errors, airframe aerodynamic and mass model parameter uncertainties, servo non-linearity (slew rate limits, etc.), and various other atmospheric and noise disturbances will not have a critically negative impact on flight path regulation.

 

[0086] Component 408 receives commands generated by component 404 and filtered by autopilot component 406. The commands received by component 408 are executed to actually manipulate the vehicle's control surfaces. Autopilot component 406 then continues to monitor vehicle stabilization and/or command tracking, making additional commands to component 408 as necessary.

 

 

At the beginning of this subsection, the Examiner asserts, “Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising: …”

 

The Examiner’s statement, “However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising: …” is conclusory and is not supported by the Examiner’s citations to Duggan.

 

In addition, none of the Duggan citations teach the limitations in Applicant’s Claim 5 that either synthetic vision or Duggan’s Variable Autonomy System is used:

1.   “during at least selected phases of the flight of said unmanned aerial vehicle”; and

2.   that the selected phases comprise:

(a)   when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

(b)   when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

Duggan fails to teach the limitation that his Variable Autonomy System is used during selected phases of a UAV’s flight and Margolin ‘724 fails to teach the limitation that synthetic vision is used during selected phases of a UAV’s flight. Therefore, the combination of Duggan and Margolin ‘724 does not read on Applicant’s Claim 5.

 

As cited above by Applicant, MPEP 2143.03, “All Claim Limitations Must Be Considered,” states: “All words in a claim must be considered in judging the patentability of that claim against the prior art.” In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970).

 

The Examiner has failed in his duty under MPEP 2143.03 (and in view of Wehling) to present a prima facie case of obviousness for rejecting Applicant’s Claim 5.

 

Regarding the Examiner’s rejection of Claim 6: Claim 6 depends on Claim 5, and Applicant has shown that Claim 5 is nonobvious. Therefore, under MPEP 2143.03 (All Claim Limitations Must Be Considered) and In re Fine, Claim 6 is nonobvious.

2143.03 All Claim Limitations Must Be **>Considered< [R-6]

** "All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Regarding the Examiner’s rejection of Claim 7: Claim 7 depends on Claim 5, and Applicant has shown that Claim 5 is nonobvious. Therefore, under MPEP 2143.03 (All Claim Limitations Must Be Considered) and In re Fine, Claim 7 is nonobvious.

2143.03 All Claim Limitations Must Be **>Considered< [R-6]

** "All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Examiner:

 

Regarding claim 8, Margolin (abstract; figs. 1-7; col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-67) in view of Duggan disclose a method for safely flying an unmanned aerial vehicle as part of a unmanned aerial system equipped with a synthetic vision system in civilian airspace comprising the steps of‑

(a)   using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

(b)    providing a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot.

 

 

Applicant:

In Margolin ‘724, Column 3, lines 8-67; Column 4, lines 1-67; and Column 5, lines 1-67 form a continuous passage from Column 3, line 8 to Column 5, line 67. This passage of approximately 1,619 words forms the core of the Margolin ‘724 DETAILED DESCRIPTION. The remainder of the Margolin ‘724 DETAILED DESCRIPTION teaches additional topics: Flight Control (with the headings Flight Control, Direct Control Non-Remotely Piloted Vehicles, Computer Mediated Non-Remotely Piloted Vehicles, Second Order Flight Control Mode, and First Order Flight Control Mode) {See Column 6, line 19 - Column 8, line 3}; the features of a Control Panel {See Column 8, line 64 - Column 9, line 18}; the use of a Head-Mounted Display {See Column 9, lines 19-32}; the use of the invention for training {See Column 9, lines 33-63}; and The Database {See Column 9, line 64 - Column 10, line 50}.

 

The Examiner cites Figures 1 - 7 in Margolin ‘724. These constitute all the figures in Margolin ‘724.

 

The Examiner also cites the Abstract in Margolin ‘724. According to MPEP § 608.01(b) Abstract of the Disclosure [R-7]:

37 CFR 1.72 Title and abstract.

*****

(b) A brief abstract of the technical disclosure in the specification must commence on a separate sheet, preferably following the claims, under the heading "Abstract" or "Abstract of the Disclosure." The sheet or sheets presenting the abstract may not include other parts of the application or other material. The abstract in an application filed under 35 U.S.C. 111 may not exceed 150 words in length. The purpose of the abstract is to enable the United States Patent and Trademark Office and the public generally to determine quickly from a cursory inspection the nature and gist of the technical disclosure.

 

{Emphasis added}

 

The popular interpretation of 608.01(b) is that the purpose of the Abstract is to provide search terms. In any event, the Abstract in Margolin ‘724 does not say anything about civilian airspace.

 

The Examiner has made a conclusory statement by repeating the title of Applicant’s invention (leaving out the words “and method”) and citing the core of the DETAILED DESCRIPTION in Margolin ‘724.

 

In the remaining sections of the Examiner’s rejection of Applicant’s Claim 8 he asserts that he has found the elements and limitations of Applicant’s invention.

(a)   using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

(b)   providing a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot.

 

The Examiner has not even attempted to show where these limitations are taught in Margolin ‘724. He has particularly failed to show where the following is taught:

(a)   using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

 

As noted, he has cited the core of the Margolin ‘724 DETAILED DESCRIPTION, all of the drawings, and the Abstract. His rejection is purely conclusory and does not meet the requirements for a prima facie rejection set out in MPEP § 2143.03 (All Claim Limitations Must Be Considered), KSR, and Wehling, as well as MPEP § 2142 (ESTABLISHING A PRIMA FACIE CASE OF OBVIOUSNESS).

 

The Examiner continues:

Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to modify Margolin as taught by Duggan for the purpose of incorporating an autopilot to ensure smooth transitions (Duggna [sic] abstract, sec 0014, 0085, 0086).

The different embodiments in both prior arts are combinable as it would be obvious to ne [sic] having ordinary skill in the art.

 

Examiner:

Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to modify Margolin as taught by Duggan for the purpose of incorporating an autopilot to ensure smooth transitions (Duggna [sic] abstract, sec 0014, 0085, 0086).

The different embodiments in both prior arts are combinable as it would be obvious to ne [sic] having ordinary skill in the art.

Duggan (the sections the Examiner cites, reproduced below in the order cited):

[0352]   In one aspect of the present invention, an operator station (also referred to as the ground control station or GCS) is designed to accommodate command and control of multiple vehicles or a single vehicle by a single operator. In accordance with one embodiment, the ground control station is platform independent and implements an application program interface that provides windowing and communications interfaces (e.g., the platform is implemented in Open Source wxWindows API). The underlying operating system is illustratively masked and enables a developer to code in a high level environment.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0318] If the pilot chooses a surveillance location outside the total FOV, then the outer loop guidance will illustratively follow a command-to-LOS mode guide law until the UAV flight path points toward the target. Once the desired staring-point comes within a minimum range threshold, the guidance automatically trips into a loiter pattern (either constant-radius or elliptical) to maintain a station with a single key-click while he/she conducts other activities. FIGS. 22A & 22B together demonstrate the surveillance-point approach scenario.

 

[0322] In accordance with one aspect of the present invention, sensor-slave mode commands are generated by an autonomous line-of-sight driven function, in which the command objectives are generated by the necessities of the function rather than by an operator. For example, a function designed to command a raster-scan of a particular surveillance area, or a function designed to scan a long a roadway could be used to generate sensor slave commands. Another example is a function designed to generate line-of-sight commands for UAV-to-UAV rendezvous formation flying.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0356] a synthetic vision display

 

[0365] The two video monitors are illustratively used to display real-time data linked camera imagery from two air vehicles having cameras (of course, fewer, more or none of the vehicles might have cameras and the number of monitor displays can be altered accordingly). In accordance with one embodiment, camera imagery is recorded on videotapes during a mission. In accordance with one embodiment, the two repeater displays are used to provide redundant views of the GUI and synthetic vision display. The laptop illustratively serves as a GUI backup in the event that the main GUI fails.

 

[0388] In one aspect of the present invention, synthetic vision display technical approach of the present invention is based upon integrating advanced simulated visuals, originally developed for training purposes, into UAV operational systems. In accordance with one embodiment, the simulated visuals are integrated with data derived from the ground control station during flight to enable real-time synthetic visuals.

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0346] In accordance with one embodiment, an exemplary translation layer implementation will now be provided. After the guidance algorithms execute, the outputs are translated to the native vehicle autopilot commands. The equations below provide example kinematic translations from the guidance acceleration commands to native vehicle autopilot commands. These equations demonstrate the principal that vehicle motion is activated through acceleration. The methods that various vehicles employ to generate acceleration are numerous (bank angle autopilot, acceleration autopilot, heading control autopilot, altitude control autopilot, etc). Since the control algorithms described herein generate acceleration commands that can be kinematically translated into any of these native autopilot commands, the guidance algorithms truly provide a generalized library of control laws that can control any vehicle through that vehicle's native atomic functions. Ubiquitous acceleration control techniques enable VACS to synthesize control commands for any vehicle, including air, ground, or sea-based.

a_v = vertical plane acceleration command
a_h = horizontal plane acceleration command
φ = tan⁻¹(a_h / a_v) = bank angle command
a_T = √(a_v² + a_h²) = total body acceleration command
ψ̇ = a_h / V = turn rate command
ψ_i = ψ_(i-1) + ψ̇ Δt = heading command
γ̇ = (a_v − g) / V = flight path rate command
γ_i = γ_(i-1) + γ̇ Δt = flight path angle command
ḣ = V sin(γ) = climb rate command
h_i = h_(i-1) + ḣ Δt = altitude command     (Eq. 57)
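Duggan's Eq. 57, quoted above, is a straightforward kinematic translation from guidance accelerations to autopilot commands. As a rough illustration only (the function and variable names below are invented and appear in neither cited reference), the translation can be sketched as:

```python
import math

def translate_guidance(a_v, a_h, V, g, dt, heading, gamma, alt):
    """Sketch of the kinematic translation in Duggan's Eq. 57:
    guidance acceleration commands (vertical a_v, horizontal a_h)
    become native autopilot commands. Illustrative names only."""
    bank = math.atan2(a_h, a_v)        # bank angle command
    a_total = math.hypot(a_v, a_h)     # total body acceleration command
    turn_rate = a_h / V                # turn rate command (psi-dot)
    heading += turn_rate * dt          # heading command (psi_i)
    gamma_rate = (a_v - g) / V         # flight path rate command (gamma-dot)
    gamma += gamma_rate * dt           # flight path angle command (gamma_i)
    climb_rate = V * math.sin(gamma)   # climb rate command (h-dot)
    alt += climb_rate * dt             # altitude command (h_i)
    return {"bank": bank, "a_total": a_total, "heading": heading,
            "gamma": gamma, "climb_rate": climb_rate, "alt": alt}
```

For unaccelerated level flight (a_v = g, a_h = 0) every command integrates to its previous value, which is the expected behavior.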

 

[0347] Additional functionality that can be enabled in a translation layer is means for discouraging or preventing an operator (e.g., the human or non-human operator interfacing the VACS architecture) from overdriving, stalling, or spinning the vehicle frame. This being said, limiting algorithms can also be employed in the guidance or autopilot functions.

[0348] X. Autopilot

[0349] As has been addressed, the present invention is not limited to, and does not require, a particular autopilot system. The control system and architecture embodiments of the present invention can be adapted to accommodate virtually any autopilot system.

[0350] For the purpose of providing an example, an illustrative suitable autopilot software system will now be described. The illustrative autopilot system incorporates a three-axis design (pitch and yaw with an attitude control loop in the roll axis) for vehicle stabilization and guidance command tracking. The autopilot software design incorporates flight control techniques, which allow vehicle control algorithms to dynamically adjust airframe stabilization parameters in real-time during flight. The flight computer is programmed directly with the airframe physical properties, so that it can automatically adjust its settings with changes in airframe configuration, aerodynamic properties, and/or flight state. This provides for a simple and versatile design, and possesses the critical flexibility needed when adjustments to the airframe configuration become necessary. The three-loop design includes angular rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, and acceleration feedback for command tracking. In addition, an integral controller in the forward loop illustratively provides enhanced command tracking, low frequency disturbance rejection and an automatic trim capability.
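The three-loop axis described in [0350] (angular-rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, acceleration feedback for command tracking, plus a forward-loop integrator for trim) can be sketched very roughly as a single update step. The gains and names below are invented for illustration; Duggan discloses no specific values:

```python
def three_loop_axis(accel_cmd, accel_meas, attitude, rate, integrator, dt,
                    k_acc=0.5, k_att=2.0, k_rate=0.8, k_int=0.3):
    """One control-surface update for one axis of a generic three-loop
    autopilot: acceleration feedback tracks the command, attitude and
    angular-rate feedback stabilize the loop, and the forward-loop
    integral term provides automatic trim. Illustrative sketch only."""
    error = accel_cmd - accel_meas           # command-tracking error
    integrator += k_int * error * dt         # forward-loop integral trim
    # attitude and rate feedback oppose the outer-loop command
    surface_cmd = k_acc * error + integrator - k_att * attitude - k_rate * rate
    return surface_cmd, integrator
```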

 

{The Examiner may have meant 0390-0392. Otherwise the range is not credible}

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0391] The described Intelligent displays with smart variables represent an effective approach to actively displaying information for different types of vehicles. However, a problem can arise when a new vehicle is integrated into the ground control station with a completely foreign command and control interface. Under these circumstances, the ground control station is not concerned about displaying data, but is tasked to provide a command and control interface for the operator to perform the required operations. This conundrum is the motivation for another embodiment of the present invention, namely, the integration of vehicle specific panels in the ground control station.

 

[0392] In one embodiment, a generic vehicle class (GVC) is illustratively a software component that provides a rapid development environment API to add new vehicle classes and types to the ground control station. The GVC also illustratively serves as a software construct that allows the inclusion of multiple vehicles within the ground control station framework. One of the variables in the application is a vector of pointers to a generic vehicle class. This list is constructed by allocating new specific vehicles and returning a type case to the base generic vehicle class. When a new vehicle is integrated into the ground control station, the generic vehicle class provides all of the virtual functions to integrate with system control components (e.g., to integrate with a map display, a communications package, PCIG imagery and/or appropriate display windows). An important object in the application framework is illustratively a pointer to the current vehicle generic class. When the user switches vehicles, this pointer is updated and all displays grab the appropriate smart variables from the pointer to the new base class. This is the mechanism by which windows immediately update to the current vehicle information whenever the user switches vehicles. The default windows use the pointer to the current vehicle to grab information. In this manner, if the user switches to a new vehicle with a different set of datalink variables, that fact is immediately apparent on the display windows.
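The generic vehicle class (GVC) mechanism of [0392] is essentially a polymorphic interface plus a "current vehicle" pointer that the display windows read through. A minimal sketch of that pattern follows; the class and method names are invented for illustration (Duggan's wording suggests a C++ implementation):

```python
class GenericVehicle:
    """Common interface: each vehicle type exposes its own set of
    datalink 'smart variables' for the display windows to read."""
    def smart_variables(self):
        raise NotImplementedError

class FixedWingUAV(GenericVehicle):
    def smart_variables(self):
        return {"airspeed_kts": 80, "altitude_ft": 1200}

class GroundStation:
    """Holds vehicles behind the GenericVehicle interface plus a
    reference to the current vehicle; when the operator switches
    vehicles, displays re-read through that reference."""
    def __init__(self):
        self.vehicles = []      # the 'vector of pointers' of [0392]
        self.current = None
    def add(self, vehicle):
        self.vehicles.append(vehicle)
        if self.current is None:
            self.current = vehicle
    def switch(self, index):
        self.current = self.vehicles[index]
    def display(self):
        return self.current.smart_variables()
```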

 

 

 

 

 

 

 

Abstract

Embodiments are disclosed for a vehicle control system and related sub-components that together provide an operator with a plurality of specific modes of operation, wherein various modes of operation incorporate different levels of autonomous control. Through a control user interface, an operator can move between certain modes of control even after vehicle deployment. Specialized autopilot system components and methods are employed to ensure smooth transitions between control modes. Empowered by the multi-modal control system, an operator can even manage multiple vehicles simultaneously.

 

[0014] Embodiments of the present invention pertain to a hierarchical control system, user interface system, and control architecture that together incorporate a broad range of user-selectable control modes representing variable levels of autonomy and vehicle control functionality. A unified autopilot is provided to process available modes and mode transitions. An intelligence synthesizer is illustratively provided to assist in resolving functional conflicts and transitioning between control modes, although certain resolutions and transitions can be incorporated directly into the functional sub-components associated with the different control modes. In accordance with one embodiment, all modes and transitions are funneled through an acceleration-based autopilot system. Accordingly, control commands and transitions are generally reduced to an acceleration vector to be processed by a centralized autopilot system.

 

[0085] As will be discussed in greater detail below, the control system and architecture embodiments of the present invention essentially enable any autopilot design to support control of a vehicle in numerous control modes that are executed with switches between modes during flight. All control modes are supported even in the presence of sensor errors, such as accelerometer and gyro biases. This robustness is at least partially attributable to the fact that the closed-loop system, in all control modes, is essentially slaved to an inertial path and, hence, the sensor biases wash out in the closed loop, assuming the biases are not so grossly large that they induce stability problems in the autopilot system. Furthermore, winds are generally not an issue in the overall control scheme in that the flight control system will regulate to the inertial path, adjusting for winds automatically in the closed loop. Given the precision afforded by inertial navigation aided by GPS technology, inertial path regulation offers a highly effective and robust UAV control approach. Generally speaking, the autopilot system functions such that winds, medium Dryden turbulence levels, sensor errors, airframe aerodynamic and mass model parameter uncertainties, servo non-linearity (slew rate limits, etc.), and various other atmospheric and noise disturbances will non have a critically negative impact on flight path regulation.

 

[0086] Component 408 receives commands generated by component 404 and filtered by autopilot component 406. The commands received by component 408 are executed to actually manipulate the vehicle's control surfaces. Autopilot component 406 then continues to monitor vehicle stabilization and/or command tracking, making additional commands to component 408 as necessary.

 

At the beginning of this subsection, the Examiner asserts, “Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising: …”

 

The Examiner’s statement, “However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising: …” is conclusory and is not supported by the Examiner’s citations to Duggan.

 

In addition, none of the Duggan citations teaches that either synthetic vision or Duggan’s Variable Autonomy System performs the step recited in Applicant’s Claim 8:

(a)   using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

 

Duggan fails to teach the limitation that his Variable Autonomy System is used during selected phases of a UAV’s flight and Margolin ‘724 fails to teach the limitation that synthetic vision is used during selected phases of a UAV’s flight. Therefore, the combination of Duggan and Margolin ‘724 does not read on Applicant’s Claim 8.

 

As cited above by Applicant, MPEP § 2143.03, “All Claim Limitations Must Be Considered,” states: “All words in a claim must be considered in judging the patentability of that claim against the prior art.” In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970).

 

The Examiner has failed his duty under MPEP 2143.03 (and in view of Wehling) to present a prima facie case of obviousness for rejecting Applicant’s Claim 8.

 

Regarding the Examiner’s rejection of Claim 9, a claim dependent on Claim 8: Applicant has shown that Claim 8 is nonobvious. Therefore, under MPEP § 2143.03 (All Claim Limitations Must Be Considered), Claim 9 is nonobvious.

2143.03 All Claim Limitations Must Be **>Considered< [R-6]

** "All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Regarding the Examiner’s rejection of Claim 10, a claim dependent on Claim 8: Applicant has shown that Claim 8 is nonobvious. Therefore, under MPEP § 2143.03 (All Claim Limitations Must Be Considered), Claim 10 is nonobvious.

2143.03 All Claim Limitations Must Be **>Considered< [R-6]

** "All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Regarding the Examiner’s rejection of Claim 11, a claim dependent on Claim 8: Applicant has shown that Claim 8 is nonobvious. Therefore, under MPEP § 2143.03 (All Claim Limitations Must Be Considered), Claim 11 is nonobvious.

2143.03 All Claim Limitations Must Be **>Considered< [R-6]

** "All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Examiner:

 

Regarding claim 12, Margolin (abstract; figs. 1-7; col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-67) in view of Duggan disclose a method for safely flying an unmanned aerial vehicle as part of a unmanned aerial system equipped with a synthetic vision system in civilian airspace comprising the steps of:

(a)   using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

(b)   providing a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot;

 

whereas said selected phases of the flight of said unmanned aerial vehicle comprise:

(a)   when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

(b) when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

Applicant:

In Margolin ‘724: Column 3, lines 8-67; Column 4, lines 1-67; and Column 5, lines 1-67 form a continuous passage from Column 3, line 8 to Column 5, line 67. This passage of approximately 1619 words forms the core of the Margolin ‘724 DETAILED DESCRIPTION. The remainder of the Margolin ‘724 DETAILED DESCRIPTION teaches additional topics such as Flight Control (with headings Flight Control, Direct Control Non-Remotely Piloted Vehicles, Computer Mediated Non-Remotely Piloted Vehicles, Second Order Flight Control Mode, and First Order Flight Control Mode) {See Column 6, line 19 - Column 8, line 3}, the features of a Control Panel {See Column 8, line 64 - Column 9, line 18}, the use of a Head-Mounted Display {See Column 9, lines 19 - 32}, the use of the invention for training {See Column 9, lines 33 - 63}, and The Database {See Column 9, line 64 - Column 10, line 50}.

 

The Examiner cites Figures 1 - 7 in Margolin ‘724. These constitute all the figures in Margolin ‘724.

 

The Examiner also cites the Abstract in Margolin ‘724. According to MPEP § 608.01(b) Abstract of the Disclosure [R-7]:

37 CFR 1.72 Title and abstract.

*****

(b) A brief abstract of the technical disclosure in the specification must commence on a separate sheet, preferably following the claims, under the heading "Abstract" or "Abstract of the Disclosure." The sheet or sheets presenting the abstract may not include other parts of the application or other material. The abstract in an application filed under 35 U.S.C. 111 may not exceed 150 words in length. The purpose of the abstract is to enable the United States Patent and Trademark Office and the public generally to determine quickly from a cursory inspection the nature and gist of the technical disclosure.

 

{Emphasis added}

 

The popular interpretation of 608.01(b) is that the purpose of the Abstract is to provide search terms. In any event, the Abstract in Margolin ‘724 does not say anything about civilian airspace.

 

The Examiner has made a conclusory statement by repeating the title of Applicant’s invention (leaving out the words “and method”) and citing the core of the DETAILED DESCRIPTION in Margolin ‘724.

 

In the remaining sections of the Examiner’s rejection of Applicant’s Claim 12 he asserts that he has found the elements and limitations of Applicant’s invention.

 

(a)   using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

(b)   providing a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot;

 

whereas said selected phases of the flight of said unmanned aerial vehicle comprise:

(a)   when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

(b)   when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

The Examiner has not even attempted to show where these limitations are taught in Margolin ‘724. He has particularly failed to show where the following is taught:

(a)   using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

 

and

whereas said selected phases of the flight of said unmanned aerial vehicle comprise:

(a)   when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

(b)   when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

As noted, he has cited the core of the Margolin ‘724 DETAILED DESCRIPTION, all of the drawings, and the abstract. His rejection is purely conclusory and does not meet the requirements for establishing a prima facie case of obviousness set out in MPEP § 2143.03 (All Claim Limitations Must Be Considered), KSR, and Wehling, as well as MPEP § 2142 (Establishing a Prima Facie Case of Obviousness).

 

The Examiner continues:

Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to modify Margolin as taught by Duggan for the purpose of incorporating an autopilot to ensure smooth transitions (Duggna abstract, sec 0014, 0085, 0086).

The different embodiments in both prior arts are combinable as it would be obvious to ne [sic] having ordinary skill in the art.

 

{In the PDF version this material is a two-column table pairing each fragment of the Examiner’s rejection (quoted in full above) with the text of the Duggan paragraph cited in that fragment. The cited Duggan paragraphs follow.}

 

 

[0352]   In one aspect of the present invention, an operator station (also referred to as the ground control station or GCS) is designed to accommodate command and control of multiple vehicles or a single vehicle by a single operator. In accordance with one embodiment, the ground control station is platform independent and implements an application program interface that provides windowing and communications interfaces (e.g., the platform is implemented in Open Source wxWindows API). The underlying operating system is illustratively masked and enables a developer to code in a high level environment.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0318] If the pilot chooses a surveillance location outside the total FOV, then the outer loop guidance will illustratively follow a command-to-LOS mode guide law until the UAV flight path points toward the target. Once the desired staring-point comes within a minimum range threshold, the guidance automatically trips into a loiter pattern (either constant-radius or elliptical) to maintain a station with a single key-click while he/she conducts other activities. FIGS. 22A & 22B together demonstrate the surveillance-point approach scenario.

 

[0322] In accordance with one aspect of the present invention, sensor-slave mode commands are generated by an autonomous line-of-sight driven function, in which the command objectives are generated by the necessities of the function rather than by an operator. For example, a function designed to command a raster-scan of a particular surveillance area, or a function designed to scan a long a roadway could be used to generate sensor slave commands. Another example is a function designed to generate line-of-sight commands for UAV-to-UAV rendezvous formation flying.

 


 

[0356] a synthetic vision display

 

[0365] The two video monitors are illustratively used to display real-time data linked camera imagery from two air vehicles having cameras (of course, fewer, more or none of the vehicles might have cameras and the number of monitor displays can be altered accordingly). In accordance with one embodiment, camera imagery is recorded on videotapes during a mission. In accordance with one embodiment, the two repeater displays are used to provide redundant views of the GUI and synthetic vision display. The laptop illustratively serves as a GUI backup in the event that the main GUI fails.

 

[0388] In one aspect of the present invention, synthetic vision display technical approach of the present invention is based upon integrating advanced simulated visuals, originally developed for training purposes, into UAV operational systems. In accordance with one embodiment, the simulated visuals are integrated with data derived from the ground control station during flight to enable real-time synthetic visuals.

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.
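
Paragraph [0390] describes a validate-then-accept flow for operator route edits: the control system checks a proposed route against vehicle performance (and terrain obstructions) before registering it. A minimal sketch of that flow, with invented class and method names and an invented performance bound (Duggan does not specify these, and the terrain check is omitted):

```python
from dataclasses import dataclass, field

@dataclass
class RouteValidator:
    """Accept an edited route only if it passes a performance check, in the
    spirit of [0390]; otherwise the active route is kept. The climb-gradient
    bound is a made-up stand-in for the vehicle performance capability;
    terrain-obstruction checking is omitted from this sketch."""
    max_climb_gradient: float = 0.15
    active_route: list = field(default_factory=list)

    def within_performance(self, route):
        # Reject any leg steeper than the assumed climb-gradient limit.
        # Each waypoint is (along-track distance m, altitude m).
        for (d1, h1), (d2, h2) in zip(route, route[1:]):
            if d2 > d1 and abs(h2 - h1) / (d2 - d1) > self.max_climb_gradient:
                return False
        return True

    def propose(self, route):
        if self.within_performance(route):
            self.active_route = route   # register the modified route
            return True
        return False                    # keep flying the existing route
```
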

 

[0346] In accordance with one embodiment, an exemplary translation layer implementation will now be provided. After the guidance algorithms execute, the outputs are translated to the native vehicle autopilot commands. The equations below provide example kinematic translations from the guidance acceleration commands to native vehicle autopilot commands. These equations demonstrate the principal that vehicle motion is activated through acceleration. The methods that various vehicles employ to generate acceleration are numerous (bank angle autopilot, acceleration autopilot, heading control autopilot, altitude control autopilot, etc). Since the control algorithms described herein generate acceleration commands that can be kinematically translated into any of these native autopilot commands, the guidance algorithms truly provide a generalized library of control laws that can control any vehicle through that vehicle's native atomic functions. Ubiquitous acceleration control techniques enable VACS to synthesize control commands for any vehicle, including air, ground, or sea-based.

{Eq. 57 is garbled in the published full-text rendering; the symbols are reconstructed here from the command names.}

a_v = vertical plane acceleration command
a_h = horizontal plane acceleration command
φ = tan⁻¹(a_h / a_v) = bank angle command
a_T = √(a_v² + a_h²) = total body acceleration command
ψ̇ = a_h / V = turn rate command
ψ_i = ψ_(i−1) + ψ̇ · t = heading command
γ̇ = (a_v − g) / V = flight path rate command
γ_i = γ_(i−1) + γ̇ · t = flight path angle command
ḣ = V sin(γ) = climb rate command
h_i = h_(i−1) + ḣ · t = altitude command      (Eq. 57)
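
The kinematic translations of Eq. 57 can be written out as a short function. This is an illustrative sketch only; the function and variable names are not Duggan's:

```python
import math

def translate_accel_to_autopilot(a_v, a_h, V, g, dt, psi_prev, gamma_prev, h_prev):
    """Translate guidance acceleration commands (vertical a_v, horizontal a_h)
    into example native autopilot commands, following the kinematic relations
    quoted from Duggan's Eq. 57. Names are illustrative."""
    phi = math.atan2(a_h, a_v)              # bank angle command
    a_T = math.hypot(a_v, a_h)              # total body acceleration command
    psi_dot = a_h / V                       # turn rate command
    psi = psi_prev + psi_dot * dt           # heading command
    gamma_dot = (a_v - g) / V               # flight path rate command
    gamma = gamma_prev + gamma_dot * dt     # flight path angle command
    h_dot = V * math.sin(gamma)             # climb rate command
    h = h_prev + h_dot * dt                 # altitude command
    return {"bank": phi, "accel": a_T, "heading": psi,
            "flight_path": gamma, "climb_rate": h_dot, "altitude": h}
```

With a_h = 0 and a_v = g (wings-level 1 g flight) the turn rate, flight path rate, and climb rate all come out zero, as the relations require.
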

 

[0347] Additional functionality that can be enabled in a translation layer is means for discouraging or preventing an operator (e.g., the human or non-human operator interfacing the VACS architecture) from overdriving, stalling, or spinning the vehicle frame. This being said, limiting algorithms can also be employed in the guidance or autopilot functions.

[0348] X. Autopilot

[0349] As has been addressed, the present invention is not limited to, and does not require, a particular autopilot system. The control system and architecture embodiments of the present invention can be adapted to accommodate virtually any autopilot system.

[0350] For the purpose of providing an example, an illustrative suitable autopilot software system will now be described. The illustrative autopilot system incorporates a three-axis design (pitch and yaw with an attitude control loop in the roll axis) for vehicle stabilization and guidance command tracking. The autopilot software design incorporates flight control techniques, which allow vehicle control algorithms to dynamically adjust airframe stabilization parameters in real-time during flight. The flight computer is programmed directly with the airframe physical properties, so that it can automatically adjust its settings with changes in airframe configuration, aerodynamic properties, and/or flight state. This provides for a simple and versatile design, and possesses the critical flexibility needed when adjustments to the airframe configuration become necessary. The three-loop design includes angular rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, and acceleration feedback for command tracking. In addition, an integral controller in the forward loop illustratively provides enhanced command tracking, low frequency disturbance rejection and an automatic trim capability.
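
The three-loop structure described in [0350] (angular-rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, acceleration feedback for command tracking, plus a forward-loop integrator for trim) is a classic cascaded design. A generic single-axis sketch follows; the gains and the exact loop arithmetic are assumptions for illustration, not the patent's actual design:

```python
class ThreeLoopAutopilot:
    """Generic single-axis three-loop controller with a forward-loop
    integrator, in the spirit of the design outlined in [0350].
    Gain values are invented for illustration."""
    def __init__(self, k_rate=0.5, k_att=2.0, k_acc=1.0, k_i=0.3):
        self.k_rate, self.k_att, self.k_acc, self.k_i = k_rate, k_att, k_acc, k_i
        self.integrator = 0.0   # provides automatic trim / low-frequency rejection

    def step(self, acc_cmd, acc_meas, attitude, rate, dt):
        err = acc_cmd - acc_meas                 # acceleration tracking error
        self.integrator += self.k_i * err * dt   # forward-loop integral action
        # Outer acceleration loop plus integrator; inner attitude and
        # rate feedback loops close around the airframe.
        return (self.k_acc * err + self.integrator
                - self.k_att * attitude - self.k_rate * rate)
```
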

 

{The Examiner may have meant 0390-0392. Otherwise the range is not credible}

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

[0391] The described Intelligent displays with smart variables represent an effective approach to actively displaying information for different types of vehicles. However, a problem can arise when a new vehicle is integrated into the ground control station with a completely foreign command and control interface. Under these circumstances, the ground control station is not concerned about displaying data, but is tasked to provide a command and control interface for the operator to perform the required operations. This conundrum is the motivation for another embodiment of the present invention, namely, the integration of vehicle specific panels in the ground control station.

 

[0392] In one embodiment, a generic vehicle class (GVC) is illustratively a software component that provides a rapid development environment API to add new vehicle classes and types to the ground control station. The GVC also illustratively serves as a software construct that allows the inclusion of multiple vehicles within the ground control station framework. One of the variables in the application is a vector of pointers to a generic vehicle class. This list is constructed by allocating new specific vehicles and returning a type case to the base generic vehicle class. When a new vehicle is integrated into the ground control station, the generic vehicle class provides all of the virtual functions to integrate with system control components (e.g., to integrate with a map display, a communications package, PCIG imagery and/or appropriate display windows). An important object in the application framework is illustratively a pointer to the current vehicle generic class. When the user switches vehicles, this pointer is updated and all displays grab the appropriate smart variables from the pointer to the new base class. This is the mechanism by which windows immediately update to the current vehicle information whenever the user switches vehicles. The default windows use the pointer to the current vehicle to grab information. In this manner, if the user switches to a new vehicle with a different set of datalink variables, that fact is immediately apparent on the display windows.

 


Abstract

Embodiments are disclosed for a vehicle control system and related sub-components that together provide an operator with a plurality of specific modes of operation, wherein various modes of operation incorporate different levels of autonomous control. Through a control user interface, an operator can move between certain modes of control even after vehicle deployment. Specialized autopilot system components and methods are employed to ensure smooth transitions between control modes. Empowered by the multi-modal control system, an operator can even manage multiple vehicles simultaneously.

 

[0014] Embodiments of the present invention pertain to a hierarchical control system, user interface system, and control architecture that together incorporate a broad range of user-selectable control modes representing variable levels of autonomy and vehicle control functionality. A unified autopilot is provided to process available modes and mode transitions. An intelligence synthesizer is illustratively provided to assist in resolving functional conflicts and transitioning between control modes, although certain resolutions and transitions can be incorporated directly into the functional sub-components associated with the different control modes. In accordance with one embodiment, all modes and transitions are funneled through an acceleration-based autopilot system. Accordingly, control commands and transitions are generally reduced to an acceleration vector to be processed by a centralized autopilot system.

 

[0085] As will be discussed in greater detail below, the control system and architecture embodiments of the present invention essentially enable any autopilot design to support control of a vehicle in numerous control modes that are executed with switches between modes during flight. All control modes are supported even in the presence of sensor errors, such as accelerometer and gyro biases. This robustness is at least partially attributable to the fact that the closed-loop system, in all control modes, is essentially slaved to an inertial path and, hence, the sensor biases wash out in the closed loop, assuming the biases are not so grossly large that they induce stability problems in the autopilot system. Furthermore, winds are generally not an issue in the overall control scheme in that the flight control system will regulate to the inertial path, adjusting for winds automatically in the closed loop. Given the precision afforded by inertial navigation aided by GPS technology, inertial path regulation offers a highly effective and robust UAV control approach. Generally speaking, the autopilot system functions such that winds, medium Dryden turbulence levels, sensor errors, airframe aerodynamic and mass model parameter uncertainties, servo non-linearity (slew rate limits, etc.), and various other atmospheric and noise disturbances will non have a critically negative impact on flight path regulation.

 

[0086] Component 408 receives commands generated by component 404 and filtered by autopilot component 406. The commands received by component 408 are executed to actually manipulate the vehicle's control surfaces. Autopilot component 406 then continues to monitor vehicle stabilization and/or command tracking, making additional commands to component 408 as necessary.

 

 

At the beginning of this subsection, the Examiner asserts, “Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising: …”

 

The Examiner’s statement, “However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising: …” is conclusory and is not supported by the Examiner’s citations to Duggan.

 

In addition, none of the Duggan citations teaches the limitation in Applicant’s Claim 12 that either synthetic vision or Duggan’s Variable Autonomy System is used as recited in the step of:

(a)   using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

and

whereas said selected phases of the flight of said unmanned aerial vehicle comprise:

(a)   when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

(b)   when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

Duggan fails to teach the limitation that his Variable Autonomy System is used during selected phases of a UAV’s flight and Margolin ‘724 fails to teach the limitation that synthetic vision is used during selected phases of a UAV’s flight. Therefore, the combination of Duggan and Margolin ‘724 does not read on Applicant’s Claim 12.

 

As cited above by Applicant, MPEP § 2143.03, All Claim Limitations Must Be Considered, states: “All words in a claim must be considered in judging the patentability of that claim against the prior art. In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970).”

 

The Examiner has failed in his duty under MPEP § 2143.03 (and in view of Wehling) to establish a prima facie case of obviousness for rejecting Applicant’s Claim 12.

 

Regarding the Examiner’s rejection of Claim 13, a claim dependent on Claim 12: Applicant has shown that Claim 12 is nonobvious. Therefore, under MPEP § 2143.03, All Claim Limitations Must Be Considered, Claim 13 is nonobvious.

2143.03 All Claim Limitations Must Be **>Considered< [R-6]

** "All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Regarding the Examiner’s rejection of Claim 14, a claim dependent on Claim 12: Applicant has shown that Claim 12 is nonobvious. Therefore, under MPEP § 2143.03, All Claim Limitations Must Be Considered, Claim 14 is nonobvious.

2143.03 All Claim Limitations Must Be **>Considered< [R-6]

** "All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Part B - The Present Applicant is the named inventor on 5,904,724.

The present Applicant (Jed Margolin) is the named inventor on U.S. Patent 5,904,724. See the attached DECLARATION OF JED MARGOLIN. The Examiner is barred from citing Margolin ‘724 as prior art in a 35 U.S.C. §103 rejection. See ISCO Int’l, Inc. v. Conductus, Inc., 279 F.Supp.2d 489 (D. Del. 2003), footnote 4:

[4]  Although § 102 relates to prior invention by another, anticipation, and abandonment, its standard for determining prior art is applied to the § 103 obviousness inquiry as well. See, e.g., Panduit Corp. v. Dennison Mfg. Co., 810 F.2d 1561, 1568 (Fed.Cir.1987), cert. denied, 481 U.S. 1052, 107 S.Ct. 2187, 95 L.Ed.2d 843 (1987) ("Before answering Graham's `content' inquiry, it must be known whether a patent or publication is in the prior art under 35 U.S.C. § 102.") (citing Graham v. John Deere Co., 383 U.S. 1, 86 S.Ct. 684, 15 L.Ed.2d 545 (1966)); Ex parte Andresen, 212 U.S.P.Q. 100, 102 (Pat.& Tr. Office Bd.App. 1981) (citing congressional committee record and commentary and concluding that Congress intended § 103 to "includ[e] all of the various bars to a patent as set forth in section 102").

 

As MPEP 2129 explains, “However, even if labeled as "prior art," the work of the same inventive entity may not be considered prior art against the claims unless it falls under one of the statutory categories.”

2129 Admissions as Prior Art [R-6]

I.    ADMISSIONS BY APPLICANT CONSTITUTE PRIOR ART

A statement by an applicant >in the specification or made< during prosecution identifying the work of another as "prior art" is an admission **>which can be relied upon for both anticipation and obviousness determinations, regardless of whether the admitted prior art would otherwise qualify as prior art under the statutory categories of 35 U.S.C. 102. Riverwood Int'l Corp. v. R.A. Jones & Co., 324 F.3d 1346, 1354, 66 USPQ2d 1331, 1337 (Fed. Cir. 2003); Constant v. Advanced Micro-Devices Inc., 848 F.2d 1560, 1570, 7 USPQ2d 1057, 1063 (Fed. Cir. 1988).< However, even if labeled as "prior art," the work of the same inventive entity may not be considered prior art against the claims unless it falls under one of the statutory categories. Id.; see also Reading & Bates Construction Co. v. Baker Energy Resources Corp., 748 F.2d 645, 650, 223 USPQ 1168, 1172 (Fed. Cir. 1984) ("[W]here the inventor continues to improve upon his own work product, his foundational work product should not, without a statutory basis, be treated as prior art solely because he admits knowledge of his own work. It is common sense that an inventor, regardless of an admission, has knowledge of his own work.").

Consequently, the examiner must determine whether the subject matter identified as "prior art" is applicant's own work, or the work of another. In the absence of another credible explanation, examiners should treat such subject matter as the work of another.

 

Part D - Applicant’s invention meets a long-felt but unmet need.

According to the article NASA Plans UAS Push (Exhibit 1 at 81):

NASA is seeking industry feedback on its plans for a new five-year, $150-million program to help integrate unmanned aircraft into civil airspace. The feedback is likely to be mixed, as the agency's last major unmanned aircraft research program was canceled before it got off the ground, despite industry backing.

 

Briefed to industry experts in early August, the Unmanned Air Systems (UAS) Integration in the National Airspace System (NAS) project is planned to begin in Fiscal 2011. It would be NASA’s first major unmanned aircraft effort since the High-Altitude Long-Endurance Remotely Operated Aircraft (HALE ROA) project was killed in 2005.

 

The new program would focus on separation assurance and collision avoidance, pilot-aircraft interface, certification requirements and communications, involving a series of increasingly complex flight demonstrations. The main goal is to generate data to help the FAA and standards organizations develop guidelines and regulations for the design and operation of UASs in the NAS. The research is expected to have an impact in the 2015-25 timeframe.

 

Applicant’s invention solves a long-felt but unmet need to safely fly UAVs in civilian airspace. (See MPEP § 716.04, Long-Felt Need and Failure of Others.) Otherwise it would not be necessary for NASA to set up “a new five-year, $150-million program to help integrate unmanned aircraft into civil airspace.”

 

Part E - The Duggan Application.

The Examiner’s choice of Duggan Patent Application US 2005004723 as a reference is interesting. By coincidence, Applicant (“Margolin”) discovered the Duggan Application not long after the USPTO published it.

 

Margolin analyzed the Duggan claims and found some deficiencies. For example, Duggan Claim 1:

 

1. A computer-implemented method for providing an operator of a vehicle with a plurality of control modes, wherein the system is configured to support transitioning between control modes during operation of the vehicle, the method comprising: receiving a first operator input that corresponds to a first control mode; generating a first directional representation of the first operator input; processing the first directional representation through a unified autopilot system so as to generate a first control output; mechanically adjusting a control component associated with the vehicle based on the first control output; receiving a second operator input that corresponds to a request to transition from the first control mode to a second control mode; transitioning from the first control mode to the second control mode; receiving a third operator input that corresponds to the second control mode; generating a second directional representation of the third operator input; processing the second directional representation through the unified autopilot system so as to generate a second control output; and mechanically adjusting a control component associated with the vehicle based on the second control output.

 

{Emphasis added}

 

This claims a method where the operator of a vehicle is able to select two or more control modes and the system transitions between them. The claim does not say how the system transitions between them other than that the autopilot does it. The term “directional representation” does not appear in the Specification. What is the “directional representation” of an operator input? Common English usage suggests that it is the line or course along which the operator moves the joystick or mouse. Also, by definition an autopilot mechanically adjusts control components, so this part of the claim is redundant.
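
Read as an algorithm, Claim 1’s steps amount to: map each operator input to a “directional representation,” process it through a single unified autopilot, and actuate a control component; a mode change merely redirects which translation applies. A minimal sketch of that flow, with all names being illustrative stand-ins rather than anything Duggan discloses:

```python
def fly(inputs, translate, unified_autopilot, actuate):
    """Process a sequence of (mode, operator_input) pairs per the structure
    of Duggan Claim 1: each input becomes a 'directional representation',
    is processed by one unified autopilot, and drives the actuators.
    All four callables are hypothetical stand-ins."""
    mode = None
    outputs = []
    for new_mode, operator_input in inputs:
        if new_mode != mode:
            mode = new_mode                              # transition between control modes
        direction = translate(mode, operator_input)      # directional representation
        control_out = unified_autopilot(direction)       # unified autopilot processing
        outputs.append(actuate(control_out))             # adjust a control component
    return outputs
```
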

 

Duggan’s Dependent claim 2 is redundant. Duggan’s Claim 1 already specifies the use of a unified autopilot.

2. The method of claim 1, wherein said transitioning comprises processing a transition command through the unified autopilot system.

 

Duggan Dependent claim 3:

3. The method of claim 1, wherein generating a first directional representation comprises generating a first set of acceleration and bank angle commands.

 

Finally, something real. A directional representation can be a set of acceleration and bank angle commands. What else can a “directional representation” be? Duggan does not teach it, so Claim 1 is indistinct.

 

Even so, this may have already been done. For example, see U.S. Patent 4,155,525, Maneuver detector circuit for use in autothrottle control systems having thrust and flight path control decoupling, issued May 22, 1979 to Peter-Contesse (assigned to Boeing). From Column 1, lines 15-28:

It is an object of this invention to provide a flight control system having thrust and flight path control decoupling utilizing maneuver detector and limited integrator circuit means in lieu of the aforementioned time-constant programmer circuit means.


It is yet another object of this invention to provide circuit means responsive to elevator, normal acceleration, and pitch attitude signals for providing a signal having a first predetermined polarity when a purposeful maneuver of the aircraft is effected and a further signal having a polarity opposite to said first predetermined polarity when a non-maneuver is indicated, a purposeful maneuver being defined as one initiated by the pilot as contrasted to non-pilot initiated aircraft maneuvers.

 

There is also U.S. Patent 6,062,513 Total energy based flight control system issued May 16, 2000 to Lambregts (also assigned to Boeing). From Column 6, line 65 - Column 7, line 14:

The present invention modifies the known TEC system by using an alternate control strategy and flight path command .gamma..sub.C processing scheme. This alternate strategy is used during manual control mode (using a control column or the like) when the thrust has been driven to a preset value (such as a maximum or minimum thrust limit) or when the automatic throttle is disengaged. Under these circumstances, instead of reverting to a pure path priority scheme for stick or control column inputs (by opening switch 30 and letting the airspeed increase or decreases until a speed limit is reached as is done in the known TEC system), the present invention transitions to a combined speed and path priority scheme, where flight path angle is the short term control priority and the set speed command is the long term priority. In this scheme, switch 30' remains closed and the normal speed control feedback is continued after thrust reaches a limit.
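The transition Lambregts describes can be paraphrased as simple mode-selection logic. This is Applicant’s sketch only; the function name, signature, and thrust units are hypothetical, offered solely to show the switching behavior the quoted passage describes:

```python
def select_control_scheme(thrust, thrust_min, thrust_max, autothrottle_engaged):
    """Return the active command-processing scheme (hypothetical names)."""
    thrust_limited = thrust <= thrust_min or thrust >= thrust_max
    if thrust_limited or not autothrottle_engaged:
        # Per the quoted passage: keep the speed feedback loop closed
        # (switch 30' remains closed); flight path angle is the short-term
        # priority and the set speed command the long-term priority.
        return "combined speed and path priority"
    # Normal TEC operation: thrust and elevator jointly decouple
    # speed control from flight path control.
    return "normal TEC control"
```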

 

Duggan Claim 31:

31. A multi-modal variable autonomy control system, the system comprising:

 

a plurality of control mode components each corresponding to a different mode of control and being configured to respond to command inputs by generating directionally descriptive control commands; and

 

a unified autopilot component for processing said directionally descriptive control commands.

 

an vehicle control component for receiving processed commands from the unified autopilot system and actuating control devices accordingly.

 

This claim contains inexcusable punctuation errors. These errors were not introduced by the Patent Office; they are in the Application in the File Wrapper. See Exhibit 2 at 83.

 

Margolin gave his analysis to Optima Technology, Inc. (now Optima Technology Group), which was then acting as Margolin’s agent for selling or licensing his patents. Optima contacted Geneva Aerospace, the assignee of the Duggan application.

 

Geneva responded by filing a Supplemental IDS listing all of Margolin’s patents (even though only 5,566,073 and 5,904,724 were relevant) and U.S. Patents 4,155,525 and 6,062,513, along with some of the non-patent literature that Margolin had presented, such as:

 

Beringer, D.; Applying Performance-Controlled Systems, Fuzzy Logic, and Fly-By-Wire Controls to General Aviation, Office of Aerospace Medicine, May 2002.

 

Abernathy, M.; “Virtual Cockpit Window for a Windowless Aerospacecraft.” http://www.nasatech.com/Briefs/Jan03/MSC23096.html  Jan. 2003.

 

See Exhibit 2 at 84-88.

 

Geneva also licensed Margolin Patents 5,566,073 and 5,904,724. See Exhibit 3 at 91.

 

It came as a complete surprise to Applicant when the Duggan Application was allowed as filed (despite its defects) in the First Office Action on the Merits (FOAM). Geneva’s attorneys may have been surprised as well; they had to ask the Duggan Examiner to correct the punctuation errors in Duggan Claim 31. See Exhibit 2 at 89.

 

Perhaps the Duggan Examiner was preoccupied with financial problems. See Exhibit 4 at 109. But where was the Second Set of Eyes? Perhaps it was sleeping that day.

 

Margolin wishes to note that the Examiner in the present case cited the Duggan Application even though it had already issued as U.S. Patent 7,343,232 (‘232), Vehicle control system including related methods and components, on March 11, 2008.

 

The Duggan Application may have other problems as well. It claims priority from Provisional Application Ser. No. 60/480,192, filed Jun. 20, 2003. According to 35 U.S.C. §102 (Conditions for patentability; novelty and loss of right to patent):

A person shall be entitled to a patent unless -

*****

(b) the invention was patented or described in a printed publication in this or a foreign country or in public use or on sale in this country, more than one year prior to the date of application for patent in the United States.

 

There is evidence that this might have occurred. The paper UCAV Distributed Mission Training Testbed: Lessons Learned and Future Challenges, by Dr. Dutch Guckenberger and Matt Archer, was presented at the 2000 Interservice/Industry Training, Simulation & Education Conference (I/ITSEC) and published in Volume: 2000 of the Proceedings (Conference Theme: Partnerships for Learning in the New Millennium). The title page and page 7 are reproduced in Exhibit 5 at 180. On document page 7 (Exhibit 5 at 183), under the heading Variable Autonomy Control System (VACS), the paper refers to Geneva Aerospace’s Variable Autonomy Control System:

 

As a portion of the DMT UCAV Testbed development, the Geneva AeroSpace Variable Autonomy Control System (VACS) was added to LiteFlite. The VACS is designed to be effective for UAV and UCAV systems as usable to individuals whose training is focused on the requirements of a given mission or the usability of the payload, rather than on the aviation of the vehicle. As the dependence on UAVs for military operations grows and UAV technology is integrated into the emerging global command and control architecture, the cost and complexity of managing and controlling these assets can easily become substantial. The VACS solution to this UAV control problem lies in the appropriate functional allocation between the human and the machine. By merging modern stand-off missile flight control, advanced aircraft flight control, and state-of-the-art communications technologies, Geneva has developed a novel hierarchical flight control structure with varied levels of remote operator input to address the human-machine functional allocation problem.

 

The VACS has been successfully demonstrated enabling a diverse range of users to effectively operate UAVs. Furthermore, the VACS solution eliminates the requirement for UAVs to be controlled by highly trained, rated pilots. In a continuing development and demonstration effort VACS is to be used Joint STARS MTE workstation and the Freewing Scorpion 100-50 UAV and conduct a flight test demonstration. This program will demonstrate the benefits of the variable autonomy flight control system design with simplified manual control modes, demonstrate the compatibility of such a system with the military’s emerging C4I architecture, and demonstrate the synergism between Joint STARS and UAVs using the simplified UAV flight control technology.

 

{Emphasis added}

 

Geneva Aerospace filed a trademark application with the USPTO on 1/22/2004 for the trademark “Variable Autonomy Control System.” See Exhibit 6 at 185. In the application Geneva Aerospace declared, under penalty of perjury:

 

The applicant, or the applicant's related company or licensee, is using the mark in commerce, and lists below the dates of use by the applicant, or the applicant's related company, licensee, or predecessor in interest, of the mark on or in connection with the identified goods and/or services. 15 U.S.C. Section 1051(a), as amended.

 

International Class 009: computer software for autonomous aerial vehicle guidance and control systems

 

In International Class 009, the mark was first used at least as early as 09/01/1998, and first used in commerce at least as early as 09/01/1998, and is now in use in such commerce. The applicant is submitting or will submit one specimen for each class showing the mark as used in commerce on or in connection with any item in the class of listed goods and/or services, consisting of a(n) Portion of company website describing product.

 

{Emphasis added}

 

The mark “Variable Autonomy Control System” is for “computer software for autonomous aerial vehicle guidance and control systems”.

 

Geneva declares that the “Variable Autonomy Control System” was first used in commerce as early as 09/01/1998, which is more than one year prior to the 6/20/2003 filing date of the provisional application.

 

Is the “Variable Autonomy Control System” in the Duggan ‘232 patent the same “Variable Autonomy Control System” that Geneva wished to trademark? Their trademark application included a portion of the company website describing the product, which states (Exhibit 6 at 188):

 

Products: Variable Autonomy Control System (VACS)TM

 

Under Air Force Research Lab funding Geneva has developed an innovative UAV control design that combines state-of-the-art missile technologies with fixed-wing aircraft control. Our design balances autonomous flight control with manual control to provide variable levels of directional independence and minimizes the personnel and training requirements for the operation of the UAV. The truly enabled UAV operator is not required to be a trained aviator, but still retains a wide range of control flexibility in order to successfully execute the mission objectives that call upon his/her specialized expertise.

 

Our solution is a hierarchical flight control structure with multiple levels of remote operator input combined with an off-board controller software package and intuitive human system interface. Research of the UAV control problem has indicated that the best solution lies in the appropriate functional allocation between the human and the machine, leading to the organization of the control problem between the two fundamental categories: flight governance and flight management.

 

{Emphasis added}

 

It sounds like it is.

 

Therefore, the Duggan ‘232 patent is invalid for failing to meet the requirements of 35 U.S.C. §102.

 

Note that the Duggan “Variable Autonomy Control System” was developed under Air Force Research Lab funding. That would give the Government certain patent rights in the invention. This is not stated in the Duggan ‘232 patent.

 

Geneva also filed an application to trademark “VACS”. They made the same declaration as they did for “Variable Autonomy Control System” and included the same company website page. See Exhibit 7 at 190.

 

Dave Duggan of Geneva Aerospace and Luis A. Piñeiro of AFRL presented a paper at the 2002 AUVSI Symposium. The paper from the Proceedings is reproduced as Exhibit 8 at 195. From Exhibit 8 at 196, last paragraph under the heading VACS Overview:

 

Funding for the variable autonomy control concept was provided under the Small Business Innovative Research (SBIR) program Phase I, Phase II, and Phase III funding vehicles through the Air Force Research Laboratory (AFRL) Human Effectiveness and Air Vehicles Integration Directorates (Reference 1).

 

Reference 1 says:

1.  Duggan, David S., “Demonstration of an Integrated Variable Autonomy UAV Flight Control System”, Phase II SBIR Final Report, AFRL-HE-WP-TR-2001-0035, January 2001

 

Applicant has not been able to obtain this reference from DTIC.

 

However, Duggan/Geneva Aerospace’s Provisional Application (Application Number 60/480,192) contains Geneva Aerospace’s Small Business Innovation Research (SBIR) Program Projects Summary, Topic Number AF98-179 (Exhibit 9 at 211), which shows that Geneva Aerospace had the invention described in ‘232 in its possession as early as the date the SBIR Projects Summary for AF98-179 was submitted. According to the Air Force SBIR Web site at http://www.afsbirsttr.com/TechMall/Default.aspx?kwa=AF98-179, the SBIR Phase I contract started 5/14/1998 and ended 2/14/1999, and the date of the DTIC report is 3/20/2001. See Exhibit 10 at 235.

 

This suggests that Geneva Aerospace was being truthful in their Trademark Applications: the products named Variable Autonomy Control System and VACS were first used commercially as early as 09/01/1998.

 

The ‘232 patent claims priority from Provisional Application 60/480,192, filed June 20, 2003, and incorporates the Provisional Application by reference in its entirety. See ‘232 Column 1, lines 6 - 9.  However, Provisional Application 60/480,192 was not made available to the public on PAIR until November 22, 2010. See Margolin Declaration ¶ 14. As a result, the public was not able to read the entire ‘232 patent until November 22, 2010.

 

The Duggan Provisional Application contains an Information Disclosure Statement (PTO-1449), filed July 29, 2004, listing a number of patent references. See Exhibit 11 at 237. With the exception of U.S. Patent 5,904,724, none of the other patent references is listed on the ‘232 patent. And, with the exception of 5,904,724, none of the references cited by Duggan in his Provisional Application is marked as having been considered by the Duggan Examiner.

 

The irregularities surrounding the ‘232 patent would call for an investigation by the USPTO’s Inspector General, but the USPTO does not seem to have an Inspector General.

 

Section 3.

 

For the foregoing reasons, Applicant submits that all objections and rejections have been overcome. Applicant requests that the rejection of pending claims 1-14 be withdrawn and that the application be allowed as filed.

 

Respectfully submitted,

 

/Jed Margolin/                        Date: November 29, 2010

Jed Margolin

 

 

Jed Margolin

1981 Empire Rd.

Reno, NV  89521-7430

(775) 847-7845

 

______________________________________________________________________

 


IN THE UNITED STATES PATENT AND TRADEMARK OFFICE

 

In re Application of Jed Margolin    

Serial No.: 11/736,356                                                                       Examiner: Ronnie M. Mancho

Filed: 04/17/2007                                                                               Art Unit: 3664

For: SYSTEM AND METHOD FOR SAFELY FLYING UNMANNED AERIAL VEHICLES

        IN CIVILIAN AIRSPACE

 

DECLARATION OF JED MARGOLIN

 

I, Jed Margolin, declare as follows:

 

1.  I am the Applicant in the above patent application.

 

2.  I am the named inventor (Jed Margolin) on U.S. Patent 5,904,724 Method and apparatus for remotely piloting an aircraft issued May 18, 1999.

 

3.  Exhibit 1 is a true and accurate reproduction of the article NASA Plans UAS Push by Graham Warwick that appeared in Aviation Week & Space Technology, August 16, 2010, page 13.

 

4.  Exhibit 2 is a true and accurate reproduction of documents from the image filewrapper for the Duggan Application 10/871,612 that I downloaded from the USPTO’s PAIR Web site on or about November 1, 2010.

 

5.  Exhibit 3 is a true and accurate reproduction of the License Agreement between Geneva Aerospace, Optima Technology, Inc., and myself. I have redacted financial information as per Federal Rules of Civil Procedure Rule 5.2. I have also redacted other sensitive information. (Note that Optima Technology, Inc. subsequently changed their name to Optima Technology Group.)

 

6.  Exhibit 4 is a true and accurate reproduction of public documents that I downloaded from the Palm Beach County, Florida Web site at http://oris.co.palm-beach.fl.us/or_web1/or_sch_1.asp between approximately August 30, 2010 and September 13, 2010.    

 

7.  Exhibit 5 is a true and accurate reproduction of the Web page that I downloaded from http://ntsa.metapress.com/link.asp?id=4mrrc0aupmjpf8e6 on or about November 16, 2010, showing the availability of the paper UCAV Distributed Mission Training Testbed: Lessons Learned and Future Challenges by Dr. Dutch Guckenberger and Matt Archer, presented at the 2000 Interservice/Industry Training, Simulation & Education Conference (I/ITSEC) and part of Volume: 2000 (Conference Theme: Partnerships for Learning in the New Millennium), followed by the title page and the seventh page from the paper, which I purchased from Meta Press on or about November 16, 2010.

 

8.  Exhibit 6 is a true and accurate reproduction of documents filed by Geneva Aerospace in Trademark Application, Serial Number 78355947 for “Variable Autonomy Control System” that I downloaded from the USPTO Trademark Document Retrieval (TDR) Web site at http://tmportal.uspto.gov/external/portal/tow on or about November 17, 2010.

 

9.  Exhibit 7 is a true and accurate reproduction of documents filed by Geneva Aerospace in Trademark Application, Serial Number 78355939 for “VACS” that I downloaded from the USPTO Trademark Document Retrieval (TDR) Web site at http://tmportal.uspto.gov/external/portal/tow on or about November 17, 2010.

 

10.  Exhibit 8 is a true and accurate reproduction of the paper Development and Testing of a Variable Autonomy Control System (VACS) for UAVs by Dave Duggan of Geneva Aerospace and Luis A. Piñeiro of AFRL contained in the Proceedings AUVSI Symposium, 2002, that was given to me by AUVSI (Association of Unmanned Vehicles International) on November 18, 2010.

 

11.  Exhibit 9 is a true and accurate reproduction of the document contained in Geneva Aerospace Provisional Application 60/480,192 Small Business Innovation Research (SBIR) Program Projects Summary, Topic Number AF98-179, that I downloaded from PAIR on November 22, 2010.

 

12.  Exhibit 10 is a true and accurate reproduction of the Web page containing Geneva Phase I Contract information for AF98-179 that I downloaded from the Air Force SBIR Web site at

http://www.afsbirsttr.com/TechMall/Default.aspx?kwa=AF98-179 on November 26, 2010.

 

13.  Exhibit 11 is a true and accurate reproduction of the Information Disclosure Statement in the Duggan Provisional Application 60/480,192 that I downloaded from PAIR on November 22, 2010.

 

14.  November 22, 2010 was the first day that Provisional Application 60/480,192 became available to the public on PAIR. Provisional Application 60/480,192 became available to the public on PAIR only as a result of my telephone conversations with Mr. Don Levin (Director of SEARCH AND INFORMATION RESOURCES ADMINISTRATION) and Mr. Richard Fernandez (of that same office) the previous week.   

 

 

I hereby declare under the penalty of perjury that the foregoing is true and correct to the best of my knowledge and belief.

 

Dated: ________________________                                    ____________________________

                                                Jed Margolin