{This is an html version of my Appeal Brief. It does not have line numbers or page numbers, and it does not include the Appendix. The PDF version (including the Appendix) is available at my blog for this issue. JM }

 

IN THE UNITED STATES PATENT AND TRADEMARK OFFICE

 

Application Serial No. 11/736,356

 

Filed: 04/17/2007                                                                  

 

For: SYSTEM AND METHOD FOR SAFELY FLYING UNMANNED AERIAL VEHICLES IN CIVILIAN AIRSPACE

 

Examiner: Ronnie M. Mancho                                   Art Unit: 3664

 

In re Application of:  Jed Margolin        

 

 

Mail Stop Appeal Brief - Patents

Commissioner for Patents

P.O. Box 1450

Alexandria, VA 22313-1450

 

Sir,

Appeal Brief

 

            This is an appeal of the Rejection dated February 15, 2011 of twice-rejected claims 1-14. A Notice of Appeal was timely filed April 17, 2011. This Appeal Brief is timely filed within two months of that date. Pro se Appellant (“Margolin”) claims Small Entity Status. The filing fee of $270 is being paid through the USPTO’s Electronic Filing System.       


Table of Contents

 

This brief contains items under the following headings as required by 37 C.F.R. § 41.37 and M.P.E.P. § 1206:

 

I.          Real Party In Interest  ………………………………………………………….    4

II.         Related Appeals and Interferences …………………………………………….    4

III.       Status of Claims  ………………………………………………………………     4

IV.       Status of Amendments  ………………………………………………………..     4

V.        Summary of Claimed Subject Matter  …………………………………………    5

VI.       Grounds of Rejection to be Reviewed on Appeal  …………………………….  13

VII.      Argument  ……………………………………………………………………… 14

VIII.     Claims Appendix  ……………………………………………………………… 52

IX.       Evidence Appendix  …………………………………………………………… 52

Exhibit 1       Patent Application as filed  …………………………………..... 61

Exhibit 2       U.S. Patent 5,904,724 (Margolin)...………………………....…  87

Exhibit 3       First Office Action on the Merits  ……………………….…… 102

Exhibit 4       U. S. Patent Application 20050004723 (Duggan) ……………. 115

Exhibit 5       Applicant’s Response to First Office Action  …………….….. 193

Exhibit 6       Second Office Action  ………………………………………... 435

Exhibit 7       Applicant’s Summary of Telephone Interview with

                      Examiner  …………………………………………………….. 452

Exhibit 8       Applicant’s Summary of Telephone Interview with

                      Examiner’s SPE  ……………………………………………... 457

Exhibit 9       IDS References Considered by Examiner  ..……………….… 461

Exhibit 10     Sensing Requirements for Unmanned Air Vehicles,

                      AFRL Air Vehicles Directorate  …………………………….. 465

Exhibit 11     Developing Sense and Avoid Requirements for Meeting

                      An Equivalent Level of Safety, Russel Wolfe  ..…………... 469

      


Exhibit 12     Article - Lockheed's Polecat UCAV Demonstrator Crashes,

                             Aviation Week & Space Technology, by Amy Butler, 03/19/2007,

                             page 44 ……………………………………………………........ 489

 

Exhibit 13     Ex parte MAURICE GIVENS Appeal 2009-003414

                      BPAI Informative Decision, Decided: August 6, 2009  …..…... 493

 

Exhibit 14     Speech - "Safety Must Come First"; J. Randolph Babbitt,

                      FAA Administrator; November 18, 2009,  FAA Web site  .…... 498

 

Exhibit 15     Article - Pentagon Accident Reports Suggest Military's

                             Drone Aircraft Plagued With Problems, by David Zucchino, from

                             The Ledger.com, July 6, 2010.

                             http://www.theledger.com/article/20100706/NEWS/7065101  .. 502

 

 

X.        Related Proceedings Appendix …………………………………………….….. 506

 


I.   REAL PARTY IN INTEREST

 

            The real party in interest for this appeal is the pro se appellant:

 

Jed Margolin

1981 Empire Rd.

Reno, NV  89521-7430

 

 

II.   RELATED APPEALS, INTERFERENCES, AND JUDICIAL PROCEEDINGS

 

            There are no other appeals, interferences, or judicial proceedings which will directly affect or be directly affected by or have a bearing on the Board’s decision in this appeal.

 

 

III.   STATUS OF CLAIMS

 

            The Application as filed included claims 1-14.

 

            Claims 1-14 have been twice-rejected in the Office Action of February 15, 2011. Claims 1-14 are being appealed.

 

 

IV.   STATUS OF AMENDMENTS

 

            In response to the Final Office Action of February 15, 2011, a Notice of Appeal was

filed on April 17, 2011. No formal amendments were filed either before or after the issuance of the Final Office Action of February 15, 2011.

 


V.   SUMMARY OF CLAIMED SUBJECT MATTER

 

Margolin’s current invention is a system and method for safely flying an unmanned aerial vehicle (UAV), unmanned combat aerial vehicle (UCAV), or remotely piloted vehicle (RPV) in civilian airspace by using a remotely located pilot to control the aircraft with a synthetic vision system during at least selected phases of the flight, such as take-offs and landings. The current invention is a new and unobvious use for U.S. Patent 5,904,724, Method and apparatus for remotely piloting an aircraft, issued May 18, 1999 to Margolin. Appellant Margolin is the same Margolin named as the inventor in 5,904,724 (‘724), which was incorporated by reference in the present application. (See Application Spec. page 2, lines 6-19.) The current application solves a long-unmet need, namely the ability to safely fly unmanned aerial vehicles in civilian airspace.

 

Independent Claim 1

References

 

1.   A system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

 

(a)  a ground station equipped with a synthetic vision system;

 

 

(b)  an unmanned aerial vehicle capable of supporting said synthetic vision system;

 

 

(c)  a remote pilot operating said ground station;

 

 

(d)  a communications link between said unmanned aerial vehicle and said ground station;

 

(e)  a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot;

 

 

 

whereas said remote pilot uses said synthetic vision system to control said unmanned aerial vehicle during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system.

 

 

 

 

 

Spec. page 1, line 19 - page 2, line 19;

‘724 Spec. Column 3, lines 28-49;

‘724 Figures 4 and 5.

 

Spec. page 1, line 19 - page 2, line 19;

‘724 Spec. Column 4, lines 1-16;

‘724 Figure 3.

 

Spec. page 2, lines 6-19;

‘724 Figure 1 #102.

 

‘724 Column 3, lines 59-67; 

‘724 Figure 1 #104, 105, 106.

 

Spec. page 5, lines 20-21;

Spec. page 15, lines 23-27;

‘724 Column 4, line 66 - Column 5,

         line 5;

‘724 Figure 3 #307.

 

 

Spec. page 4, line 32 - page 5, line 3;

Spec. page 5, lines 13-15.

 

 

Dependent Claim 2

References

 

2.   The system of claim 1 whereby said selected phases of the flight of said unmanned aerial vehicle comprise:

 

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

 

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

 

 

 

Spec. page 5, lines 5-7;

Figures 1 and 2.

 

 

Spec. page 5, lines 8-9;

Figures 1 and 2.

 

 

Dependent Claim 3

References

 

3.   The system of claim 1 further comprising a system onboard said unmanned aerial vehicle for periodically transmitting the identification, location, altitude, and bearing of said unmanned aerial vehicle.

 

 

Spec. page 5, lines 17-19.

 

Dependent Claim 4

References

 

4.   The system of claim 1 further comprising a system onboard said unmanned aerial vehicle for providing a communications channel for Air Traffic Control and the pilots of other aircraft to communicate directly with said remote pilot.

 

Spec. page 5, lines 22-23;

Spec. page 16, lines 1-4.

 


Independent Claim 5

References

 

5.   A system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

 

(a)  a ground station equipped with a synthetic vision system;

 

 

(b)  an unmanned aerial vehicle capable of supporting said synthetic vision system;

 

 

(c)  a remote pilot operating said ground station;

 

 

(d)  a communications link between said unmanned aerial vehicle and said ground station;

 

(e)  a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot;

 

 

whereas said remote pilot uses said synthetic vision system to control said unmanned aerial vehicle during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system, and

 

whereas the selected phases of the flight of said unmanned aerial vehicle comprise:

 

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

 

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

 

 

 

Spec. page 1, line 19 - page 2, line 19;

‘724 Spec. Column 3, lines 28-49;

‘724 Figures 4 and 5.

 

Spec. page 1, line 19 - page 2, line 19;

‘724 Spec. Column 4, lines 1-16;

‘724 Figure 3.

 

Spec. page 2, lines 6-19;

 ‘724 Figure 1 #102.

 

‘724 Column 3, lines 59-67; 

‘724 Figure 1 #104, 105, 106;

 

Spec. page 5, lines 20-21;

Spec. page 15, lines 23-27;

‘724 Column 4, line 66 - Column 5,

         line 5;

‘724 Figure 3 #307.

 

Spec. page 4, line 32 - page 5, line 3;

Spec. page 5, lines 13-15.

 

 

 

 

 

 

 

 

 

Spec. page 5, lines 5-7;

Figures 1 and 2.

 

 

Spec. page 5, lines 8-9;

Figures 1 and 2.

 

 

 

Dependent Claim 6

References

 

6.   The system of claim 5 further comprising a system onboard said unmanned aerial vehicle for periodically transmitting the identification, location, altitude, and bearing of said unmanned aerial vehicle.

 

Spec. page 5, lines 17-19.

 

Dependent Claim 7

Reference

 

7.   The system of claim 5 further comprising a system onboard said unmanned aerial vehicle for providing a communications channel for Air Traffic Control and the pilots of other aircraft to communicate directly with said remote pilot.

 

Spec. page 5, lines 22-23;

Spec. page 16, lines 1-4.

 

Independent Claim 8

References

 

8.   A method for safely flying an unmanned aerial vehicle as part of a unmanned aerial system equipped with a synthetic vision system in civilian airspace comprising the steps of:

 

(a)  using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

 

(b)  providing a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot.

 

 

 

 

 

 

Spec. page 1, line 19 - page 2, line 19;

Spec. page 4, lines 32-34;

Spec. page 5, lines 1-3;

Spec. page 5, lines 13-15.

 

 

 

 

 

Spec. page 5, lines 20-21;

Spec. page 15, lines 23-27;

‘724 Column 4, line 66 - Column 5,

         line 5;

‘724 Figure 3 #307.

 

 

Dependent Claim 9

References

 

9.  The method of claim 8 whereby said selected phases of the flight of said unmanned aerial vehicle comprise:

 

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

 

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

 

 

 

Spec. page 5, lines 5-7;

Figures 1 and 2.

 

 

Spec. page 5, lines 8-9;

Figures 1 and 2.

 

 

Dependent Claim 10

References

 

10.   The method of claim 8 further comprising the step of providing a system onboard said unmanned aerial vehicle for periodically transmitting the identification, location, altitude, and bearing of said unmanned aerial vehicle.

 

 

Spec. page 5, lines 17-19.

 


 

Dependent Claim 11

References

 

11.   The method of claim 8 further comprising the step of providing a system onboard said unmanned aerial vehicle for providing a communications channel for Air Traffic Control and the pilots of other aircraft to communicate directly with said remote pilot.

 

Spec. page 5, lines 22-23;

Spec. page 16, lines 1-4.

 

Independent Claim 12

References

 

12.   A method for safely flying an unmanned aerial vehicle as part of a unmanned aerial system equipped with a synthetic vision system in civilian airspace comprising the steps of:

 

(a)  using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

 

(b)  providing a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot;

 

 

whereas said selected phases of the flight of said unmanned aerial vehicle comprise:

 

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

 

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

 

 

 

 

 

Spec. page 1, line 19 - page 2, line 19;

Spec. page 4, lines 32-34;

Spec. page 5, lines 1-3;

Spec. page 5, lines 13-15.

 

 

 

 

 

Spec. page 5, lines 20-21;

Spec. page 15, lines 23-27;

‘724 Column 4, line 66 - Column 5,

         line 5;

‘724 Figure 3 #307.

 

 

 

 

Spec. page 5, lines 5-7;

Figures 1 and 2.

 

 

Spec. page 5, lines 8-9;

Figures 1 and 2.

 

 

Dependent Claim 13

References

 

13.   The method of claim 12 further comprising the step of providing a system onboard said unmanned aerial vehicle for periodically transmitting the identification, location, altitude, and bearing of said unmanned aerial vehicle.

 

Spec. page 5, lines 17-19.

 

Dependent Claim 14

References

 

14.   The method of claim 12 further comprising the step of providing a system onboard said unmanned aerial vehicle for providing a communications channel for Air Traffic Control and the pilots of other aircraft to communicate directly with said remote pilot.

 

Spec. page 5, lines 22-23;

Spec. page 16, lines 1-4.

 


VI.       GROUNDS OF REJECTION TO BE REVIEWED ON APPEAL

 

A.   Claims 1-14 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over U.S. Patent 5,904,724 (‘724) to Margolin (the same Margolin as the Appellant) in view of Patent Application Publication US 2005/0004723 to Duggan.

 

B.  Whether Margolin had a duty to define the term “civilian airspace” or whether he was entitled to use the common meaning of the term.

 

C.   Whether Margolin had a duty to define “safety” or whether he was entitled to use the common meaning of the term; and whether Margolin defined a particular level of safety.

 

D.  Whether the Examiner’s assertion that “It is believed that the aircraft flown in the prior art is flown safely …”, which is made without evidence, is proper.

 


VII.   ARGUMENT

 

Ground A

 

Claims 1-14 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over U.S. Patent 5,904,724 (‘724) to Margolin (the same Margolin as the Appellant) in view of Patent Application Publication US 2005/0004723 to Duggan.

 

The following is a quotation of 35 U.S.C. § 103(a):

(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102 of this title, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negatived by the manner in which the invention was made.

MPEP § 2142 states under the heading ESTABLISHING A PRIMA FACIE CASE OF OBVIOUSNESS:

a.   **>The key to supporting any rejection under 35 U.S.C. 103 is the clear articulation of the reason(s) why the claimed invention would have been obvious. The Supreme Court in KSR International Co. v. Teleflex Inc., 550 U.S. ___, ___, 82 USPQ2d 1385, 1396 (2007) noted that the analysis supporting a rejection under 35 U.S.C. 103 should be made explicit. The Federal Circuit has stated that "rejections on obviousness cannot be sustained with mere conclusory statements; instead, there must be some articulated reasoning with some rational underpinning to support the legal conclusion of obviousness." In re Kahn, 441 F.3d 977, 988, 78 USPQ2d 1329, 1336 (Fed. Cir. 2006). See also KSR, 550 U.S. at ___ , 82 USPQ2d at 1396 (quoting Federal Circuit statement with approval). <

 

{Emphasis added}

 

In his 35 U.S.C. § 103(a) rejection, the Examiner failed to make a prima facie case of obviousness.

 

Margolin’s current invention is a system and method for safely flying an unmanned aerial vehicle (UAV), unmanned combat aerial vehicle (UCAV), or remotely piloted vehicle (RPV) in civilian airspace by using a remotely located pilot to control the aircraft using a synthetic vision system during at least selected phases of the flight such as during take-offs and landings.

 

The current invention is a new and unobvious use for U.S. Patent 5,904,724, Method and apparatus for remotely piloting an aircraft, issued May 18, 1999 to Margolin. Applicant/Appellant Margolin is the same Margolin named as the inventor in 5,904,724 (‘724), which was incorporated by reference in the present application. From Application Spec. page 2, lines 6-19:

[003]   The use of Synthetic Vision in flying a UAV is taught by U.S. Patent 5,904,724    Method and apparatus for remotely piloting an aircraft issued May 18, 1999 to Margolin (the present Applicant) which is hereby incorporated by reference.[1]

 

Claim 1 (Independent)

 

In claim 1, the new and unobvious use of ‘724 lies in using synthetic vision during selected phases of the flight; during those phases of the flight where synthetic vision is not used, an autonomous control system is used. In claim 1 this element is:

whereas said remote pilot uses said synthetic vision system to control said unmanned aerial vehicle during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system.

 

The Examiner asserts that he found this element in ‘724 as follows, from Office Action dated September 1, 2010, page 3, second paragraph (Evidence Appendix Exhibit 3 at 105) and Office Action dated February 15, 2011, page 3, second paragraph (Evidence Appendix Exhibit 6 at 438):

whereas said remote pilot uses said synthetic vision system (305, 306, 307, 311 on aircraft) to control said unmanned aerial vehicle 300 during at least selected phases of the flight of said unmanned aerial vehicle.


He also finds it in Duggan, in Office Action dated September 1, 2010, page 3 (Evidence Appendix Exhibit 3 at 105) and Office Action dated February 15, 2011, page 3 (Evidence Appendix Exhibit 6 at 438):

Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

 

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

 

The Examiner’s citations to ’724 are to reference numerals in the figures, namely Figure 3. ’724 Figure 3 is reproduced here:

[’724 Figure 3 (image not reproduced in this html version)]

The Examiner’s assertion that this shows “whereas said remote pilot uses said synthetic vision system (305, 306, 307, 311 on aircraft) to control said unmanned aerial vehicle 300 during at least selected phases of the flight of said unmanned aerial vehicle” goes beyond a broadest reasonable interpretation. It goes beyond even a broadest possible interpretation.

 

The same is true of the Duggan references cited by the Examiner:

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

 

Duggan:

[0352]   In one aspect of the present invention, an operator station (also referred to as the ground control station or GCS) is designed to accommodate command and control of multiple vehicles or a single vehicle by a single operator. In accordance with one embodiment, the ground control station is platform independent and implements an application program interface that provides windowing and communications interfaces (e.g., the platform is implemented in Open Source wxWindows API). The underlying operating system is illustratively masked and enables a developer to code in a high level environment.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0318] If the pilot chooses a surveillance location outside the total FOV, then the outer loop guidance will illustratively follow a command-to-LOS mode guide law until the UAV flight path points toward the target. Once the desired staring-point comes within a minimum range threshold, the guidance automatically trips into a loiter pattern (either constant-radius or elliptical) to maintain a station with a single key-click while he/she conducts other activities. FIGS. 22A & 22B together demonstrate the surveillance-point approach scenario.

 

[0322] In accordance with one aspect of the present invention, sensor-slave mode commands are generated by an autonomous line-of-sight driven function, in which the command objectives are generated by the necessities of the function rather than by an operator. For example, a function designed to command a raster-scan of a particular surveillance area, or a function designed to scan a long a roadway could be used to generate sensor slave commands. Another example is a function designed to generate line-of-sight commands for UAV-to-UAV rendezvous formation flying.

 

[0353] (reproduced in full above)

 

[0356] a synthetic vision display

 

[0365] The two video monitors are illustratively used to display real-time data linked camera imagery from two air vehicles having cameras (of course, fewer, more or none of the vehicles might have cameras and the number of monitor displays can be altered accordingly). In accordance with one embodiment, camera imagery is recorded on videotapes during a mission. In accordance with one embodiment, the two repeater displays are used to provide redundant views of the GUI and synthetic vision display. The laptop illustratively serves as a GUI backup in the event that the main GUI fails.

 

[0388] In one aspect of the present invention, synthetic vision display technical approach of the present invention is based upon integrating advanced simulated visuals, originally developed for training purposes, into UAV operational systems. In accordance with one embodiment, the simulated visuals are integrated with data derived from the ground control station during flight to enable real-time synthetic visuals.

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0346] In accordance with one embodiment, an exemplary translation layer implementation will now be provided. After the guidance algorithms execute, the outputs are translated to the native vehicle autopilot commands. The equations below provide example kinematic translations from the guidance acceleration commands to native vehicle autopilot commands. These equations demonstrate the principal that vehicle motion is activated through acceleration. The methods that various vehicles employ to generate acceleration are numerous (bank angle autopilot, acceleration autopilot, heading control autopilot, altitude control autopilot, etc). Since the control algorithms described herein generate acceleration commands that can be kinematically translated into any of these native autopilot commands, the guidance algorithms truly provide a generalized library of control laws that can control any vehicle through that vehicle's native atomic functions. Ubiquitous acceleration control techniques enable VACS to synthesize control commands for any vehicle, including air, ground, or sea-based.

a_v = vertical plane acceleration command
a_h = horizontal plane acceleration command
φ = tan⁻¹(a_h / a_v) = bank angle command
a_T = √(a_v² + a_h²) = total body acceleration command
ψ̇ = a_h / V = turn rate command
ψ_i = ψ_(i−1) + ψ̇·Δt = heading command
γ̇ = (a_v − g) / V = flight path rate command
γ_i = γ_(i−1) + γ̇·Δt = flight path angle command
ḣ = V·sin(γ) = climb rate command
h_i = h_(i−1) + ḣ·Δt = altitude command          Eq. 57

 

[0347] Additional functionality that can be enabled in a translation layer is means for discouraging or preventing an operator (e.g., the human or non-human operator interfacing the VACS architecture) from overdriving, stalling, or spinning the vehicle frame. This being said, limiting algorithms can also be employed in the guidance or autopilot functions.

[0348] X. Autopilot

[0349] As has been addressed, the present invention is not limited to, and does not require, a particular autopilot system. The control system and architecture embodiments of the present invention can be adapted to accommodate virtually any autopilot system.

[0350] For the purpose of providing an example, an illustrative suitable autopilot software system will now be described. The illustrative autopilot system incorporates a three-axis design (pitch and yaw with an attitude control loop in the roll axis) for vehicle stabilization and guidance command tracking. The autopilot software design incorporates flight control techniques, which allow vehicle control algorithms to dynamically adjust airframe stabilization parameters in real-time during flight. The flight computer is programmed directly with the airframe physical properties, so that it can automatically adjust its settings with changes in airframe configuration, aerodynamic properties, and/or flight state. This provides for a simple and versatile design, and possesses the critical flexibility needed when adjustments to the airframe configuration become necessary. The three-loop design includes angular rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, and acceleration feedback for command tracking. In addition, an integral controller in the forward loop illustratively provides enhanced command tracking, low frequency disturbance rejection and an automatic trim capability.

 

The Examiner then refers to the range 0390-0329. In Margolin’s Response to the First Office Action of September 1, 2010, Margolin pointed out that this range did not make sense. From Evidence Appendix Exhibit 5 at 205:

{The Examiner may have meant 0390-0392. Otherwise the range is not credible}

 

Margolin assumed (and still assumes) that the Examiner meant 0390-0392.

 

And yet, in the Second Office Action (February 15, 2011), the Examiner makes the same mistake. See Evidence Appendix Exhibit 6 at 438. This calls into question the Examiner’s statement that “Applicant’s arguments filed 11/29/10 have been fully considered but they are not persuasive.” (See Evidence Appendix Exhibit 6 at 445.) The real reason the Examiner did not find Margolin’s arguments persuasive is that he did not read them. Nor did he read the Specification in the Application, or he would have known that Applicant (and now Appellant) Margolin is also the Margolin of ‘724.

 

Here is Duggan 0390-0392:

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0391] The described Intelligent displays with smart variables represent an effective approach to actively displaying information for different types of vehicles. However, a problem can arise when a new vehicle is integrated into the ground control station with a completely foreign command and control interface. Under these circumstances, the ground control station is not concerned about displaying data, but is tasked to provide a command and control interface for the operator to perform the required operations. This conundrum is the motivation for another embodiment of the present invention, namely, the integration of vehicle specific panels in the ground control station.

 

[0392] In one embodiment, a generic vehicle class (GVC) is illustratively a software component that provides a rapid development environment API to add new vehicle classes and types to the ground control station. The GVC also illustratively serves as a software construct that allows the inclusion of multiple vehicles within the ground control station framework. One of the variables in the application is a vector of pointers to a generic vehicle class. This list is constructed by allocating new specific vehicles and returning a type case to the base generic vehicle class. When a new vehicle is integrated into the ground control station, the generic vehicle class provides all of the virtual functions to integrate with system control components (e.g., to integrate with a map display, a communications package, PCIG imagery and/or appropriate display windows). An important object in the application framework is illustratively a pointer to the current vehicle generic class. When the user switches vehicles, this pointer is updated and all displays grab the appropriate smart variables from the pointer to the new base class. This is the mechanism by which windows immediately update to the current vehicle information whenever the user switches vehicles. The default windows use the pointer to the current vehicle to grab information. In this manner, if the user switches to a new vehicle with a different set of datalink variables, that fact is immediately apparent on the display windows.
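{Duggan's generic vehicle class mechanism, a base class of virtual functions, a list ("vector of pointers") of vehicles, and a current-vehicle pointer that the displays read, can be sketched as follows. This is an illustrative sketch only; the class and variable names are assumptions, not Duggan's code.}

```python
class GenericVehicle:
    """Base class: the virtual functions a ground control station reads.
    (Illustrative only; names are assumptions.)"""
    def datalink_variables(self):
        raise NotImplementedError

class FixedWingUAV(GenericVehicle):
    def datalink_variables(self):
        return {"airspeed_kts": 72, "altitude_ft": 1200}

class GroundRover(GenericVehicle):
    def datalink_variables(self):
        return {"speed_mph": 8, "heading_deg": 45}

class GroundControlStation:
    def __init__(self):
        self.vehicles = []       # the "vector of pointers" to GenericVehicle
        self.current = None      # pointer to the current vehicle

    def add_vehicle(self, v):
        self.vehicles.append(v)
        if self.current is None:
            self.current = v

    def switch_to(self, index):
        # When the user switches vehicles, the pointer is updated and the
        # displays grab their variables from the new current vehicle.
        self.current = self.vehicles[index]
        return self.current.datalink_variables()

gcs = GroundControlStation()
gcs.add_vehicle(FixedWingUAV())
gcs.add_vehicle(GroundRover())
vars_now = gcs.switch_to(1)   # displays now show the rover's variables
```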

 

Not only do the Duggan citations fail to support a broadest reasonable interpretation (or even a broadest possible interpretation) for the Examiner’s assertion, they amount to a series of non sequiturs.

 

They certainly fail to make a prima facie case for rejection.

 

In addition, although the Examiner’s rejections of claim 1 in the Office Action of September 1, 2010 (Evidence Appendix Exhibit 3 at 102) and in the Office Action of February 15, 2011 (Evidence Appendix Exhibit 6 at 435) are almost identical, the Examiner added some language to the February 15, 2011 rejection.


 

The September 1, 2010 rejection, page 3 (Evidence Appendix Exhibit 3 at 105):

 

whereas said remote pilot uses said synthetic vision system (305, 306, 307, 311 on aircraft) to control said unmanned aerial vehicle 300 during at least selected phases of the flight of said unmanned aerial vehicle.

 

Margolin did not disclose that the vehicle is flown using an autonomous control system. However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

 

The February 15, 2011 rejection, page 3 (Evidence Appendix Exhibit 6 at 438):

 

whereas said remote pilot uses said synthetic vision system (305, 306, 307, 311 on aircraft; col. 5, lines 50-60) to control said unmanned aerial vehicle 300 during at least selected phases of the flight of said unmanned aerial vehicle (selected phases implies some or all phases during flight).

 

Margolin did not disclose that the vehicle is flown using an autonomous control system (e.g. autopilot). However, Duggan teach of a system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

The added language “(selected phases implies some or all phases during flight)” might be a benign addition, but probably is not; otherwise the Examiner would not have added it. Margolin intended that the phases be selected. The phrase “some or all phases” is broader and includes “all phases,” which is clearly not Margolin’s intent.

 

The added language “(e.g. autopilot)” is definitely not benign. An autonomous control system is much more than an autopilot, and Margolin does not equate the two.

 

By making the second rejection final the Examiner has denied Margolin the opportunity to respond to these additions to the second rejection.

 

 

Claim 2 (Dependent)

 

Claim 2 is a dependent claim, dependent on Claim 1. Margolin has shown that Claim 1 is nonobvious. Therefore, under M.P.E.P. § 2143.03, All Claim Limitations Must Be Considered, Claim 2 is nonobvious.

2143.03 All Claim Limitations Must Be Considered [R-6]

"All words in a claim must be considered in judging the patentability of that claim against the prior art." In re Wilson, 424 F.2d 1382, 1385, 165 USPQ 494, 496 (CCPA 1970). If an independent claim is nonobvious under 35 U.S.C. 103, then any claim depending therefrom is nonobvious. In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988).

 

Claim 3 (Dependent)

 

Claim 3 is a dependent claim, dependent on Claim 1. Margolin has shown that Claim 1 is nonobvious. Therefore, under M.P.E.P. § 2143.03, All Claim Limitations Must Be Considered, Claim 3 is nonobvious.

 

Claim 4 (Dependent)

 

Claim 4 is a dependent claim, dependent on Claim 1. Margolin has shown that Claim 1 is nonobvious. Therefore, under M.P.E.P. § 2143.03, All Claim Limitations Must Be Considered, Claim 4 is nonobvious.

 

Claim 5 (Independent)

 

In claim 5, the new and unobvious use for ‘724 is the use of synthetic vision during selected phases of the flight, with an autonomous control system flying the vehicle during those phases of the flight when synthetic vision is not used; further, the selected phases comprise (a) when the unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude, and (b) when the unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

In claim 5 this element is:

whereas said remote pilot uses said synthetic vision system to control said unmanned aerial vehicle during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system, and

 

whereas the selected phases of the flight of said unmanned aerial vehicle comprise:

 

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

 

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

However, whereas in the Examiner’s rejection of claim 1 he made at least some attempt to identify the different elements in ‘724, in his rejection of claim 5 he simply cited the following: abstract; figs. 1-7; col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-67. See Office Action dated September 1, 2010, page 4, last paragraph (Evidence Appendix Exhibit 3 at 106) and Office Action dated February 15, 2011, page 4, last paragraph (Evidence Appendix Exhibit 6 at 439).

 

The three passages cited in ‘724 (Column 3, lines 8-67; Column 4, lines 1-67; and Column 5, lines 1-67) form a continuous passage from Column 3, line 8 to Column 5, line 67. This passage of approximately 1619 words forms the core of the ‘724 DETAILED DESCRIPTION. The Examiner also cited all of the drawings and the abstract.

 

Breaking the long contiguous passage of approximately 1619 words into three sections is misleading. By doing this the Examiner shows awareness of his failure to make a prima facie case for rejection. Or, perhaps it was simply laziness.

 

The Examiner did cite Duggan in one of the elements, but only one:

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

 

As with claim 1, the Duggan references are irrelevant. And again, the Examiner repeats the mistake of referring to 0390-0329.

Duggan:

[0352]   In one aspect of the present invention, an operator station (also referred to as the ground control station or GCS) is designed to accommodate command and control of multiple vehicles or a single vehicle by a single operator. In accordance with one embodiment, the ground control station is platform independent and implements an application program interface that provides windowing and communications interfaces (e.g., the platform is implemented in Open Source wxWindows API). The underlying operating system is illustratively masked and enables a developer to code in a high level environment.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0318] If the pilot chooses a surveillance location outside the total FOV, then the outer loop guidance will illustratively follow a command-to-LOS mode guide law until the UAV flight path points toward the target. Once the desired staring-point comes within a minimum range threshold, the guidance automatically trips into a loiter pattern (either constant-radius or elliptical) to maintain a station with a single key-click while he/she conducts other activities. FIGS. 22A & 22B together demonstrate the surveillance-point approach scenario.

 

[0322] In accordance with one aspect of the present invention, sensor-slave mode commands are generated by an autonomous line-of-sight driven function, in which the command objectives are generated by the necessities of the function rather than by an operator. For example, a function designed to command a raster-scan of a particular surveillance area, or a function designed to scan a long a roadway could be used to generate sensor slave commands. Another example is a function designed to generate line-of-sight commands for UAV-to-UAV rendezvous formation flying.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0356] a synthetic vision display

 

[0365] The two video monitors are illustratively used to display real-time data linked camera imagery from two air vehicles having cameras (of course, fewer, more or none of the vehicles might have cameras and the number of monitor displays can be altered accordingly). In accordance with one embodiment, camera imagery is recorded on videotapes during a mission. In accordance with one embodiment, the two repeater displays are used to provide redundant views of the GUI and synthetic vision display. The laptop illustratively serves as a GUI backup in the event that the main GUI fails.

 

[0388] In one aspect of the present invention, synthetic vision display technical approach of the present invention is based upon integrating advanced simulated visuals, originally developed for training purposes, into UAV operational systems. In accordance with one embodiment, the simulated visuals are integrated with data derived from the ground control station during flight to enable real-time synthetic visuals.

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0346] In accordance with one embodiment, an exemplary translation layer implementation will now be provided. After the guidance algorithms execute, the outputs are translated to the native vehicle autopilot commands. The equations below provide example kinematic translations from the guidance acceleration commands to native vehicle autopilot commands. These equations demonstrate the principal that vehicle motion is activated through acceleration. The methods that various vehicles employ to generate acceleration are numerous (bank angle autopilot, acceleration autopilot, heading control autopilot, altitude control autopilot, etc). Since the control algorithms described herein generate acceleration commands that can be kinematically translated into any of these native autopilot commands, the guidance algorithms truly provide a generalized library of control laws that can control any vehicle through that vehicle's native atomic functions. Ubiquitous acceleration control techniques enable VACS to synthesize control commands for any vehicle, including air, ground, or sea-based.

a_v = vertical plane acceleration command
a_h = horizontal plane acceleration command
φ = tan⁻¹(a_h / a_v) = bank angle command
a_T = √(a_v² + a_h²) = total body acceleration command
ψ̇ = a_h / V = turn rate command
ψ_i = ψ_(i−1) + ψ̇·Δt = heading command
γ̇ = (a_v − g) / V = flight path rate command
γ_i = γ_(i−1) + γ̇·Δt = flight path angle command
ḣ = V·sin(γ) = climb rate command
h_i = h_(i−1) + ḣ·Δt = altitude command    (Eq. 57)

 

[0347] Additional functionality that can be enabled in a translation layer is means for discouraging or preventing an operator (e.g., the human or non-human operator interfacing the VACS architecture) from overdriving, stalling, or spinning the vehicle frame. This being said, limiting algorithms can also be employed in the guidance or autopilot functions.

[0348] X. Autopilot

[0349] As has been addressed, the present invention is not limited to, and does not require, a particular autopilot system. The control system and architecture embodiments of the present invention can be adapted to accommodate virtually any autopilot system.

[0350] For the purpose of providing an example, an illustrative suitable autopilot software system will now be described. The illustrative autopilot system incorporates a three-axis design (pitch and yaw with an attitude control loop in the roll axis) for vehicle stabilization and guidance command tracking. The autopilot software design incorporates flight control techniques, which allow vehicle control algorithms to dynamically adjust airframe stabilization parameters in real-time during flight. The flight computer is programmed directly with the airframe physical properties, so that it can automatically adjust its settings with changes in airframe configuration, aerodynamic properties, and/or flight state. This provides for a simple and versatile design, and possesses the critical flexibility needed when adjustments to the airframe configuration become necessary. The three-loop design includes angular rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, and acceleration feedback for command tracking. In addition, an integral controller in the forward loop illustratively provides enhanced command tracking, low frequency disturbance rejection and an automatic trim capability.

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0391] The described Intelligent displays with smart variables represent an effective approach to actively displaying information for different types of vehicles. However, a problem can arise when a new vehicle is integrated into the ground control station with a completely foreign command and control interface. Under these circumstances, the ground control station is not concerned about displaying data, but is tasked to provide a command and control interface for the operator to perform the required operations. This conundrum is the motivation for another embodiment of the present invention, namely, the integration of vehicle specific panels in the ground control station.

 

[0392] In one embodiment, a generic vehicle class (GVC) is illustratively a software component that provides a rapid development environment API to add new vehicle classes and types to the ground control station. The GVC also illustratively serves as a software construct that allows the inclusion of multiple vehicles within the ground control station framework. One of the variables in the application is a vector of pointers to a generic vehicle class. This list is constructed by allocating new specific vehicles and returning a type case to the base generic vehicle class. When a new vehicle is integrated into the ground control station, the generic vehicle class provides all of the virtual functions to integrate with system control components (e.g., to integrate with a map display, a communications package, PCIG imagery and/or appropriate display windows). An important object in the application framework is illustratively a pointer to the current vehicle generic class. When the user switches vehicles, this pointer is updated and all displays grab the appropriate smart variables from the pointer to the new base class. This is the mechanism by which windows immediately update to the current vehicle information whenever the user switches vehicles. The default windows use the pointer to the current vehicle to grab information. In this manner, if the user switches to a new vehicle with a different set of datalink variables, that fact is immediately apparent on the display windows.

 

The Examiner, in particular, made no attempt at all to point out where the cited art teaches the following limitation in claim 5:

whereas the selected phases of the flight of said unmanned aerial vehicle comprise:

 

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

 

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

Again, not only do the Duggan citations fail to support a broadest reasonable interpretation (or even a broadest possible interpretation) for the Examiner’s assertion, they amount to a series of non sequiturs.

 

Claim 6 (Dependent)

 

Claim 6 is a dependent claim, dependent on Claim 5. Margolin has shown that Claim 5 is nonobvious. Therefore, under M.P.E.P. § 2143.03, All Claim Limitations Must Be Considered, Claim 6 is nonobvious.

 

Claim 7 (Dependent)

 

Claim 7 is a dependent claim, dependent on Claim 5. Margolin has shown that Claim 5 is nonobvious. Therefore, under M.P.E.P. § 2143.03, All Claim Limitations Must Be Considered, Claim 7 is nonobvious.

 

Claim 8 (Independent)

 

As with his rejection of independent claim 5, the Examiner simply cited the following in ‘724: abstract; figs. 1-7; col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-67. See Office Action dated September 1, 2010, pages 6-7 (Evidence Appendix Exhibit 3 at 108) and Office Action dated February 15, 2011, pages 6-7 (Evidence Appendix Exhibit 6 at 441). He then asserted that he had found most of the elements contained therein.

 

The three passages cited in ‘724 (Column 3, lines 8-67; Column 4, lines 1-67; and Column 5, lines 1-67) form a continuous passage from Column 3, line 8 to Column 5, line 67. This passage of approximately 1619 words forms the core of the ‘724 DETAILED DESCRIPTION. The Examiner also cited all of the drawings and the abstract.

 

Breaking the long contiguous passage of approximately 1619 words into three sections is misleading. By doing this the Examiner shows awareness of his failure to make a prima facie case for rejection. Or, perhaps it was simply laziness.

 

The Examiner did cite Duggan in one of the elements, but only one:

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

 

As with claim 1 and claim 5, the Duggan references are irrelevant. And again, the Examiner repeats the mistake of referring to 0390-0329.

Duggan:

[0352]   In one aspect of the present invention, an operator station (also referred to as the ground control station or GCS) is designed to accommodate command and control of multiple vehicles or a single vehicle by a single operator. In accordance with one embodiment, the ground control station is platform independent and implements an application program interface that provides windowing and communications interfaces (e.g., the platform is implemented in Open Source wxWindows API). The underlying operating system is illustratively masked and enables a developer to code in a high level environment.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0318] If the pilot chooses a surveillance location outside the total FOV, then the outer loop guidance will illustratively follow a command-to-LOS mode guide law until the UAV flight path points toward the target. Once the desired staring-point comes within a minimum range threshold, the guidance automatically trips into a loiter pattern (either constant-radius or elliptical) to maintain a station with a single key-click while he/she conducts other activities. FIGS. 22A & 22B together demonstrate the surveillance-point approach scenario.

 

[0322] In accordance with one aspect of the present invention, sensor-slave mode commands are generated by an autonomous line-of-sight driven function, in which the command objectives are generated by the necessities of the function rather than by an operator. For example, a function designed to command a raster-scan of a particular surveillance area, or a function designed to scan a long a roadway could be used to generate sensor slave commands. Another example is a function designed to generate line-of-sight commands for UAV-to-UAV rendezvous formation flying.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0356] a synthetic vision display

 

[0365] The two video monitors are illustratively used to display real-time data linked camera imagery from two air vehicles having cameras (of course, fewer, more or none of the vehicles might have cameras and the number of monitor displays can be altered accordingly). In accordance with one embodiment, camera imagery is recorded on videotapes during a mission. In accordance with one embodiment, the two repeater displays are used to provide redundant views of the GUI and synthetic vision display. The laptop illustratively serves as a GUI backup in the event that the main GUI fails.

 

[0388] In one aspect of the present invention, synthetic vision display technical approach of the present invention is based upon integrating advanced simulated visuals, originally developed for training purposes, into UAV operational systems. In accordance with one embodiment, the simulated visuals are integrated with data derived from the ground control station during flight to enable real-time synthetic visuals.

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0346] In accordance with one embodiment, an exemplary translation layer implementation will now be provided. After the guidance algorithms execute, the outputs are translated to the native vehicle autopilot commands. The equations below provide example kinematic translations from the guidance acceleration commands to native vehicle autopilot commands. These equations demonstrate the principal that vehicle motion is activated through acceleration. The methods that various vehicles employ to generate acceleration are numerous (bank angle autopilot, acceleration autopilot, heading control autopilot, altitude control autopilot, etc). Since the control algorithms described herein generate acceleration commands that can be kinematically translated into any of these native autopilot commands, the guidance algorithms truly provide a generalized library of control laws that can control any vehicle through that vehicle's native atomic functions. Ubiquitous acceleration control techniques enable VACS to synthesize control commands for any vehicle, including air, ground, or sea-based.

    a_v = vertical plane acceleration command
    a_h = horizontal plane acceleration command
    φ = tan⁻¹(a_h / a_v) = bank angle command
    a_T = √(a_v² + a_h²) = total body acceleration command
    ψ̇ = a_h / V = turn rate command
    ψ_i = ψ_(i−1) + ψ̇·Δt = heading command
    γ̇ = (a_v − g) / V = flight path rate command
    γ_i = γ_(i−1) + γ̇·Δt = flight path angle command
    ḣ = V·sin(γ) = climb rate command
    h_i = h_(i−1) + ḣ·Δt = altitude command          (Eq. 57)
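The kinematic translations of Eq. 57 in the quoted paragraph [0346] amount to simple closed-form updates from the two guidance acceleration commands. A minimal sketch follows, assuming a discrete time step dt; the function name, argument names, and units are illustrative assumptions, not from Duggan:

```python
import math

def translate_accel_commands(a_v, a_h, V, psi_prev, gamma_prev, h_prev, dt, g=9.81):
    """Sketch of Eq. 57: translate guidance acceleration commands (vertical a_v,
    horizontal a_h) into native autopilot commands. All names are illustrative."""
    phi = math.atan2(a_h, a_v)            # bank angle command (atan2 used for robustness)
    a_T = math.hypot(a_v, a_h)            # total body acceleration command
    psi_dot = a_h / V                     # turn rate command
    psi = psi_prev + psi_dot * dt         # heading command (discrete integration)
    gamma_dot = (a_v - g) / V             # flight path rate command
    gamma = gamma_prev + gamma_dot * dt   # flight path angle command
    h_dot = V * math.sin(gamma)           # climb rate command
    h = h_prev + h_dot * dt               # altitude command
    return {"bank": phi, "total_accel": a_T, "heading": psi,
            "flight_path_angle": gamma, "climb_rate": h_dot, "altitude": h}
```

For example, commanding a_v equal to gravity with no horizontal acceleration yields straight-and-level flight: zero bank, zero turn rate, and an unchanged altitude.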

 

[0347] Additional functionality that can be enabled in a translation layer is means for discouraging or preventing an operator (e.g., the human or non-human operator interfacing the VACS architecture) from overdriving, stalling, or spinning the vehicle frame. This being said, limiting algorithms can also be employed in the guidance or autopilot functions.

[0348] X. Autopilot

[0349] As has been addressed, the present invention is not limited to, and does not require, a particular autopilot system. The control system and architecture embodiments of the present invention can be adapted to accommodate virtually any autopilot system.

[0350] For the purpose of providing an example, an illustrative suitable autopilot software system will now be described. The illustrative autopilot system incorporates a three-axis design (pitch and yaw with an attitude control loop in the roll axis) for vehicle stabilization and guidance command tracking. The autopilot software design incorporates flight control techniques, which allow vehicle control algorithms to dynamically adjust airframe stabilization parameters in real-time during flight. The flight computer is programmed directly with the airframe physical properties, so that it can automatically adjust its settings with changes in airframe configuration, aerodynamic properties, and/or flight state. This provides for a simple and versatile design, and possesses the critical flexibility needed when adjustments to the airframe configuration become necessary. The three-loop design includes angular rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, and acceleration feedback for command tracking. In addition, an integral controller in the forward loop illustratively provides enhanced command tracking, low frequency disturbance rejection and an automatic trim capability.
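The three-loop structure described in the quoted paragraph [0350] (angular-rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, acceleration feedback for command tracking, plus a forward-loop integrator) can be sketched for a single axis as follows; the gains, names, and discrete-time form are illustrative assumptions, not taken from Duggan:

```python
class ThreeLoopAutopilot:
    """Minimal single-axis sketch of a three-loop autopilot: an outer
    acceleration-tracking loop with a forward-loop integrator (automatic trim,
    low-frequency disturbance rejection), a middle attitude loop, and an inner
    angular-rate damping loop. Gains are illustrative placeholders."""

    def __init__(self, k_accel=1.0, k_att=2.0, k_rate=0.5, k_i=0.2):
        self.k_accel, self.k_att, self.k_rate, self.k_i = k_accel, k_att, k_rate, k_i
        self.integrator = 0.0

    def step(self, accel_cmd, accel_meas, attitude, rate, dt):
        err = accel_cmd - accel_meas              # outer loop: acceleration tracking error
        self.integrator += self.k_i * err * dt    # forward-loop integral term
        att_cmd = self.k_accel * err + self.integrator
        rate_cmd = self.k_att * (att_cmd - attitude)  # middle loop: attitude stiffness
        return self.k_rate * (rate_cmd - rate)        # inner loop: rate damping -> surface command
```

With zero error on all three loops the commanded deflection is zero, and a step in the acceleration command produces an immediate proportional response plus a slowly accumulating trim term.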

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0391] The described Intelligent displays with smart variables represent an effective approach to actively displaying information for different types of vehicles. However, a problem can arise when a new vehicle is integrated into the ground control station with a completely foreign command and control interface. Under these circumstances, the ground control station is not concerned about displaying data, but is tasked to provide a command and control interface for the operator to perform the required operations. This conundrum is the motivation for another embodiment of the present invention, namely, the integration of vehicle specific panels in the ground control station.

 

[0392] In one embodiment, a generic vehicle class (GVC) is illustratively a software component that provides a rapid development environment API to add new vehicle classes and types to the ground control station. The GVC also illustratively serves as a software construct that allows the inclusion of multiple vehicles within the ground control station framework. One of the variables in the application is a vector of pointers to a generic vehicle class. This list is constructed by allocating new specific vehicles and returning a type case to the base generic vehicle class. When a new vehicle is integrated into the ground control station, the generic vehicle class provides all of the virtual functions to integrate with system control components (e.g., to integrate with a map display, a communications package, PCIG imagery and/or appropriate display windows). An important object in the application framework is illustratively a pointer to the current vehicle generic class. When the user switches vehicles, this pointer is updated and all displays grab the appropriate smart variables from the pointer to the new base class. This is the mechanism by which windows immediately update to the current vehicle information whenever the user switches vehicles. The default windows use the pointer to the current vehicle to grab information. In this manner, if the user switches to a new vehicle with a different set of datalink variables, that fact is immediately apparent on the display windows.
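The mechanism described in the quoted paragraph [0392] (a list of vehicles held through a generic base class, with displays reading their variables through a pointer to the current vehicle) can be sketched as follows; the class names, methods, and datalink variables are illustrative assumptions, not from Duggan:

```python
class GenericVehicle:
    """Sketch of the 'generic vehicle class' (GVC): a common interface
    that every vehicle type implements."""
    def datalink_variables(self):
        raise NotImplementedError

class FixedWingUAV(GenericVehicle):
    def datalink_variables(self):
        return {"airspeed_kts": 80, "altitude_ft": 2000}

class Rotorcraft(GenericVehicle):
    def datalink_variables(self):
        return {"rotor_rpm": 350, "altitude_ft": 500}

class GroundControlStation:
    def __init__(self, vehicles):
        self.vehicles = vehicles      # the 'vector of pointers' to GVC instances
        self.current = vehicles[0]    # pointer to the currently selected vehicle

    def switch_vehicle(self, index):
        # Displays read through self.current, so they update as soon as it changes.
        self.current = self.vehicles[index]

    def render_display(self):
        return self.current.datalink_variables()
```

Because the display windows dereference only the current-vehicle pointer, switching vehicles requires no display-specific logic: a vehicle with a different set of datalink variables is immediately reflected on screen.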

 

In particular, the Examiner did not even attempt to point out where the cited references teach the following limitation of claim 8:

whereas the selected phases of the flight of said unmanned aerial vehicle comprise:

 

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

 

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.
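Read on its face, this limitation is a concrete two-branch test on the vehicle's range to an airport (or other designated location) and its altitude. A minimal sketch, with hypothetical threshold values (the claim itself recites none):

```python
def selected_flight_phase(range_to_airport_nm, altitude_ft,
                          selected_range_nm=5.0,
                          first_altitude_ft=3000.0,
                          second_altitude_ft=1000.0):
    """Sketch of the claimed phase selection. The selected range and the
    first/second specified altitudes are hypothetical example values."""
    if range_to_airport_nm <= selected_range_nm and altitude_ft < first_altitude_ft:
        return "phase (a): within selected range, below first specified altitude"
    if range_to_airport_nm > selected_range_nm and altitude_ft < second_altitude_ft:
        return "phase (b): outside selected range, below second specified altitude"
    return "outside the selected phases"
```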

 

Again, not only do the Duggan citations fail to support a broadest reasonable interpretation (or even a broadest possible interpretation) for the Examiner’s assertion, they amount to a series of non sequiturs.

 

Claim 9 (Dependent)

 

Claim 9 is a dependent claim, dependent on Claim 8. Margolin has shown that Claim 8 is nonobvious. Therefore, under MPEP § 2143.03 (All Claim Limitations Must Be Considered), Claim 9 is also nonobvious.

 

Claim 10 (Dependent)

 

Claim 10 is a dependent claim, dependent on Claim 8. Margolin has shown that Claim 8 is nonobvious. Therefore, under MPEP § 2143.03 (All Claim Limitations Must Be Considered), Claim 10 is also nonobvious.

 

Claim 11 (Dependent)

 

Claim 11 is a dependent claim, dependent on Claim 8. Margolin has shown that Claim 8 is nonobvious. Therefore, under MPEP § 2143.03 (All Claim Limitations Must Be Considered), Claim 11 is also nonobvious.

 


Claim 12 (Independent)

 

As with his rejections of claim 5 and claim 8, in rejecting claim 12 the Examiner simply cited the following from ‘724: abstract; figs. 1-7; col. 3, lines 8-67; col. 4, lines 1-67; col. 5, lines 1-67. See Office Action dated September 1, 2010, pages 8-9 (Evidence Appendix Exhibit 3 at 110) and Office Action dated February 15, 2011, pages 8-9 (Evidence Appendix Exhibit 6 at 443).

 

The three passages cited in ‘724 (Column 3, lines 8-67; Column 4, lines 1-67; and Column 5, lines 1-67) form a continuous passage from Column 3, line 8 to Column 5, line 67. This passage of approximately 1619 words forms the core of the ‘724 DETAILED DESCRIPTION. The Examiner also cited all of the drawings and the abstract.

 

Breaking the long contiguous passage of approximately 1619 words into three sections is misleading. By doing this the Examiner shows awareness of his failure to make a prima facie case for rejection. Or, perhaps it was simply laziness.

 

The Examiner did cite Duggan in one of the elements, but only one:

a ground station controlling an unmanned aerial vehicle (sec. 0352, 00353), wherein during phases of a flight of an unmanned aerial vehicle (UAV, sec 0318, 0322, 0353) when a synthetic vision (sec. 0356, 0365, 0388, 0390) is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system (autopilot, sec 0346 to 0350, 0390-0329).

 

As with the rejections of claim 1, claim 5, and claim 8, the Duggan citations are irrelevant. And again, the Examiner repeats his mistake of referring to 0390-0329.

Duggan:

[0352]   In one aspect of the present invention, an operator station (also referred to as the ground control station or GCS) is designed to accommodate command and control of multiple vehicles or a single vehicle by a single operator. In accordance with one embodiment, the ground control station is platform independent and implements an application program interface that provides windowing and communications interfaces (e.g., the platform is implemented in Open Source wxWindows API). The underlying operating system is illustratively masked and enables a developer to code in a high level environment.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0318] If the pilot chooses a surveillance location outside the total FOV, then the outer loop guidance will illustratively follow a command-to-LOS mode guide law until the UAV flight path points toward the target. Once the desired staring-point comes within a minimum range threshold, the guidance automatically trips into a loiter pattern (either constant-radius or elliptical) to maintain a station with a single key-click while he/she conducts other activities. FIGS. 22A & 22B together demonstrate the surveillance-point approach scenario.
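The mode logic in the quoted paragraph [0318] (command-to-LOS guidance toward the surveillance point, with an automatic trip into a loiter pattern once within a minimum range threshold) can be sketched as follows; the names, planar geometry, and return convention are illustrative assumptions, not from Duggan:

```python
import math

def surveillance_guidance(uav_pos, target_pos, min_range):
    """Sketch of [0318] mode logic: return ('command_to_los', heading) while
    outside the minimum range threshold, else ('loiter', None) to hold a
    station over the surveillance point. Positions are planar (x, y)."""
    dx = target_pos[0] - uav_pos[0]
    dy = target_pos[1] - uav_pos[1]
    rng = math.hypot(dx, dy)
    if rng <= min_range:
        return ("loiter", None)                     # trip into loiter pattern
    return ("command_to_los", math.atan2(dy, dx))   # heading command toward target
```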

 

[0322] In accordance with one aspect of the present invention, sensor-slave mode commands are generated by an autonomous line-of-sight driven function, in which the command objectives are generated by the necessities of the function rather than by an operator. For example, a function designed to command a raster-scan of a particular surveillance area, or a function designed to scan a long a roadway could be used to generate sensor slave commands. Another example is a function designed to generate line-of-sight commands for UAV-to-UAV rendezvous formation flying.

 

[0353] In one embodiment, the ground control station incorporates several specialized user interface concepts designed to effectively support a single operator tasked to control multiple vehicles. The GCS also illustratively supports manual control and sensor steering modes. In the manual control mode, the operator can assume control authority of the vehicles individually from the ground control station at any time in flight. In the sensor steering mode, a vehicle will autonomously fly in the direction the operator is manually pointing the on-board imaging sensor (e.g., operator views video output from a digital camera on a TV interface, computer screen display, etc.). A custom data link is illustratively, utilized to support a two-way transfer of data between the ground control station and the UAV's. These design concepts together provide a flexible, multiple vehicle control system. The details of the concepts are discussed below.

 

[0356] a synthetic vision display

 

[0365] The two video monitors are illustratively used to display real-time data linked camera imagery from two air vehicles having cameras (of course, fewer, more or none of the vehicles might have cameras and the number of monitor displays can be altered accordingly). In accordance with one embodiment, camera imagery is recorded on videotapes during a mission. In accordance with one embodiment, the two repeater displays are used to provide redundant views of the GUI and synthetic vision display. The laptop illustratively serves as a GUI backup in the event that the main GUI fails.

 

[0388] In one aspect of the present invention, synthetic vision display technical approach of the present invention is based upon integrating advanced simulated visuals, originally developed for training purposes, into UAV operational systems. In accordance with one embodiment, the simulated visuals are integrated with data derived from the ground control station during flight to enable real-time synthetic visuals.

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0346] In accordance with one embodiment, an exemplary translation layer implementation will now be provided. After the guidance algorithms execute, the outputs are translated to the native vehicle autopilot commands. The equations below provide example kinematic translations from the guidance acceleration commands to native vehicle autopilot commands. These equations demonstrate the principal that vehicle motion is activated through acceleration. The methods that various vehicles employ to generate acceleration are numerous (bank angle autopilot, acceleration autopilot, heading control autopilot, altitude control autopilot, etc). Since the control algorithms described herein generate acceleration commands that can be kinematically translated into any of these native autopilot commands, the guidance algorithms truly provide a generalized library of control laws that can control any vehicle through that vehicle's native atomic functions. Ubiquitous acceleration control techniques enable VACS to synthesize control commands for any vehicle, including air, ground, or sea-based.

    a_v = vertical plane acceleration command
    a_h = horizontal plane acceleration command
    φ = tan⁻¹(a_h / a_v) = bank angle command
    a_T = √(a_v² + a_h²) = total body acceleration command
    ψ̇ = a_h / V = turn rate command
    ψ_i = ψ_(i−1) + ψ̇·Δt = heading command
    γ̇ = (a_v − g) / V = flight path rate command
    γ_i = γ_(i−1) + γ̇·Δt = flight path angle command
    ḣ = V·sin(γ) = climb rate command
    h_i = h_(i−1) + ḣ·Δt = altitude command          (Eq. 57)

 

[0347] Additional functionality that can be enabled in a translation layer is means for discouraging or preventing an operator (e.g., the human or non-human operator interfacing the VACS architecture) from overdriving, stalling, or spinning the vehicle frame. This being said, limiting algorithms can also be employed in the guidance or autopilot functions.

[0348] X. Autopilot

[0349] As has been addressed, the present invention is not limited to, and does not require, a particular autopilot system. The control system and architecture embodiments of the present invention can be adapted to accommodate virtually any autopilot system.

[0350] For the purpose of providing an example, an illustrative suitable autopilot software system will now be described. The illustrative autopilot system incorporates a three-axis design (pitch and yaw with an attitude control loop in the roll axis) for vehicle stabilization and guidance command tracking. The autopilot software design incorporates flight control techniques, which allow vehicle control algorithms to dynamically adjust airframe stabilization parameters in real-time during flight. The flight computer is programmed directly with the airframe physical properties, so that it can automatically adjust its settings with changes in airframe configuration, aerodynamic properties, and/or flight state. This provides for a simple and versatile design, and possesses the critical flexibility needed when adjustments to the airframe configuration become necessary. The three-loop design includes angular rate feedback for stability augmentation, attitude feedback for closed-loop stiffness, and acceleration feedback for command tracking. In addition, an integral controller in the forward loop illustratively provides enhanced command tracking, low frequency disturbance rejection and an automatic trim capability.

 

[0390] In one aspect of the present invention, through GUI display 2622, an operator can maintain a variable level of control over a UAV, from fully manual to fully autonomous, with simple user-friendly inputs. For example, if an operator decides to divert a UAV to a new route, the operator has a plurality of options to select from. The following are examples of some of the options that an operator has. Those skilled in the art should recognize that this is not an exhaustive list. In one embodiment, the operator could graphically edit the existing route on mission situation display 2629 by adding a waypoint or orbit pattern in the vicinity of a desired target region. Prior to accepting the edited route, the control system evaluates the revised route against the vehicle performance capability as well as terrain obstructions. If the route is within acceptable bounds, the control system registers the modified route and maneuvers the vehicle accordingly. In another embodiment, the operator could select a park mode on selections pane 2630. After selected, the control system queues the operator to click the location of and graphical size (via a mouse) the desired orbit pattern in which the vehicle will fly while "parked" over a desired target. In another embodiment, the operator can select a manual control mode on selections pane 2630. By selecting RDC (remote directional command), for example, the control system controls the UAV into a constant altitude, heading and speed flight until the operator instructs a maneuver. While in RDC mode, the operator can either pseudo-manually direct the UAV using the control stick (e.g. joystick) or the operator can program a fixed heading, altitude and speed using the control options provided in selections pane 2630.

 

[0391] The described Intelligent displays with smart variables represent an effective approach to actively displaying information for different types of vehicles. However, a problem can arise when a new vehicle is integrated into the ground control station with a completely foreign command and control interface. Under these circumstances, the ground control station is not concerned about displaying data, but is tasked to provide a command and control interface for the operator to perform the required operations. This conundrum is the motivation for another embodiment of the present invention, namely, the integration of vehicle specific panels in the ground control station.

 

[0392] In one embodiment, a generic vehicle class (GVC) is illustratively a software component that provides a rapid development environment API to add new vehicle classes and types to the ground control station. The GVC also illustratively serves as a software construct that allows the inclusion of multiple vehicles within the ground control station framework. One of the variables in the application is a vector of pointers to a generic vehicle class. This list is constructed by allocating new specific vehicles and returning a type case to the base generic vehicle class. When a new vehicle is integrated into the ground control station, the generic vehicle class provides all of the virtual functions to integrate with system control components (e.g., to integrate with a map display, a communications package, PCIG imagery and/or appropriate display windows). An important object in the application framework is illustratively a pointer to the current vehicle generic class. When the user switches vehicles, this pointer is updated and all displays grab the appropriate smart variables from the pointer to the new base class. This is the mechanism by which windows immediately update to the current vehicle information whenever the user switches vehicles. The default windows use the pointer to the current vehicle to grab information. In this manner, if the user switches to a new vehicle with a different set of datalink variables, that fact is immediately apparent on the display windows.

 

In particular, the Examiner did not even attempt to point out where the cited references teach the following limitation of claim 12:

whereas the selected phases of the flight of said unmanned aerial vehicle comprise:

 

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

 

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

Again, not only do the Duggan citations fail to support a broadest reasonable interpretation (or even a broadest possible interpretation) for the Examiner’s assertion, they amount to a series of non sequiturs.

 

Claim 13 (Dependent)

 

Claim 13 is a dependent claim, dependent on Claim 12. Margolin has shown that Claim 12 is nonobvious. Therefore, under MPEP § 2143.03 (All Claim Limitations Must Be Considered), Claim 13 is also nonobvious.

 

Claim 14 (Dependent)

 

Claim 14 is a dependent claim, dependent on Claim 12. Margolin has shown that Claim 12 is nonobvious. Therefore, under MPEP § 2143.03 (All Claim Limitations Must Be Considered), Claim 14 is also nonobvious.

 

 

The remaining grounds come from the section "Response to Arguments" on page 10 of the Office Action of February 15, 2011 (see Evidence Appendix Exhibit 6 at 445). By making the Office Action final, the Examiner denied Margolin the opportunity to respond to the Examiner's new arguments. In a telephone interview with the Examiner on or about March 2, 2011, Margolin asked the Examiner to withdraw the finality of the Office Action so that Margolin could respond. The Examiner refused. (See Evidence Appendix Exhibit 7 at 452.) In a telephone interview on or about March 22, 2011, the Examiner's SPE distinguished the "Response to Arguments" section of the Office Action of February 15, 2011 from the formal rejection in "Claim Rejections," and stated that "Response to Arguments" was not subject to MPEP § 706.07(a). (See Evidence Appendix Exhibit 8 at 457.)

 

The Examiner’s Response to Arguments contains arguments that he could have made in the Office Action of September 1, 2010. If he had done so, Margolin would have had the opportunity to refute them and produce new evidence. The Examiner denied Margolin this opportunity.

 

If, as the Examiner’s SPE stated, the Examiner’s Response to Arguments is not part of the formal rejection, then the Board of Appeals should either order that the Response to Arguments be stricken from the Office Action or simply ignore it. Otherwise, the Board of Appeals should consider the following.


Ground B. 

 

Whether Margolin had a duty to define the term “civilian airspace” or whether he was entitled to use the common meaning of the term.

 

In the Office Action of February 15, 2011 on page 10 (Evidence Appendix Exhibit 6 at 445) the Examiner stated:

Applicant further argues that the prior art do not disclose flying an unmanned aerial vehicle (i.e. an aircraft) in civilian airspace. The examiner does not acquiesce to applicant's remarks. The prior art clearly shows flying an unmanned aerial vehicle (i.e. an aircraft) in civilian airspace since the air space in which the vehicle is flown is not restricted. As further noted applicant fails to provide a particular meaning attached to "civilian airspace".

 

{Emphasis added}

 

 While an Applicant is permitted to be his own lexicographer, he is not required to be one. He may rely on the common meanings of words. From MPEP § 2111.01 Plain Meaning [R-5] - 2100 Patentability:

I.    THE WORDS OF A CLAIM MUST BE GIVEN THEIR "PLAIN MEANING" UNLESS **>SUCH MEANING IS INCONSISTENT WITH< THE SPECIFICATION

 

**>Although< claims of issued patents are interpreted in light of the specification, prosecution history, prior art and other claims, this is not the mode of claim interpretation to be applied during examination. During examination, the claims must be interpreted as broadly as their terms reasonably allow. In re American Academy of Science Tech Center, 367 F.3d 1359, 1369, 70 USPQ2d 1827, 1834 (Fed. Cir. 2004) (The USPTO uses a different standard for construing claims than that used by district courts; during examination the USPTO must give claims their broadest reasonable interpretation >in light of the specification<.). This means that the words of the claim must be given their plain meaning unless **>the plain meaning is inconsistent with< the specification. In re Zletz, 893 F.2d 319, 321, 13 USPQ2d 1320, 1322 (Fed. Cir. 1989) (discussed below); Chef America, Inc. v. Lamb-Weston, Inc., 358 F.3d 1371, 1372, 69 USPQ2d 1857 (Fed. Cir. 2004) (Ordinary, simple English words whose meaning is clear and unquestionable, absent any indication that their use in a particular context changes their meaning, are construed to mean exactly what they say. Thus, "heating the resulting batter-coated dough to a temperature in the range of about 400oF to 850oF" required heating the dough, rather than the air inside an oven, to the specified temperature.). **

 

 {Emphasis added}

 

Part II does not apply here:

II.    IT IS IMPROPER TO IMPORT CLAIM LIMITATIONS FROM THE SPECIFICATION

 

Part III does:

III.    < "PLAIN MEANING" REFERS TO THE ORDINARY AND CUSTOMARY MEANING GIVEN TO THE TERM BY THOSE OF ORDINARY SKILL IN THE ART

"[T]he ordinary and customary meaning of a claim term is the meaning that the term would have to a person of ordinary skill in the art in question at the time of the invention, i.e., as of the effective filing date of the patent application." Phillips v. AWH Corp., 415 F.3d 1303, 1313, 75 USPQ2d 1321, 1326 (Fed. Cir. 2005) (en banc); Sunrace Roots Enter. Co. v. SRAM Corp., 336 F.3d 1298, 1302, 67 USPQ2d 1438, 1441 (Fed. Cir. 2003); Brookhill-Wilk 1, LLC v. Intuitive Surgical, Inc., 334 F.3d 1294, 1298, 67 USPQ2d 1132, 1136 (Fed. Cir. 2003) ("In the absence of an express intent to impart a novel meaning to the claim terms, the words are presumed to take on the ordinary and customary meanings attributed to them by those of ordinary skill in the art."). It is the use of the words in the context of the written description and customarily by those skilled in the relevant art that accurately reflects both the "ordinary" and the "customary" meaning of the terms in the claims. Ferguson Beauregard/Logic Controls v. Mega Systems, 350 F.3d 1327, 1338, 69 USPQ2d 1001, 1009 (Fed. Cir. 2003) (Dictionary definitions were used to determine the ordinary and customary meaning of the words "normal" and "predetermine" to those skilled in the art. In construing claim terms, the general meanings gleaned from reference sources, such as dictionaries, must always be compared against the use of the terms in context, and the intrinsic record must always be consulted to identify which of the different possible dictionary meanings is most consistent with the use of the words by the inventor.); ACTV, Inc. v. The Walt Disney Company, 346 F.3d 1082, 1092, 68 USPQ2d 1516, 1524 (Fed. Cir. 2003) (Since there was no express definition given for the term "URL" in the specification, the term should be given its broadest reasonable interpretation consistent with the intrinsic record and take on the ordinary and customary meaning attributed to it by those of ordinary skill in the art; thus, the term "URL" was held to encompass both relative and absolute URLs.); and E-Pass Technologies, Inc. v. 3Com Corporation, 343 F.3d 1364, 1368, 67 USPQ2d 1947, 1949 (Fed. Cir. 2003) (Where no explicit definition for the term "electronic multi-function card" was given in the specification, this term should be given its ordinary meaning and broadest reasonable interpretation; the term should not be limited to the industry standard definition of credit card where there is no suggestion that this definition applies to the electronic multi-function card as claimed, and should not be limited to preferred embodiments in the specification.).

The ordinary and customary meaning of a term may be evidenced by a variety of sources, including "the words of the claims themselves, the remainder of the specification, the prosecution history, and extrinsic evidence concerning relevant scientific principles, the meaning of technical terms, and the state of the art." Phillips v. AWH Corp., 415 F.3d at 1314, 75 USPQ2d at 1327. If extrinsic reference sources, such as dictionaries, evidence more than one definition for the term, the intrinsic record must be consulted to identify which of the different possible definitions is most consistent with applicant's use of the terms. Brookhill-Wilk 1, 334 F.3d at 1300, 67 USPQ2d at 1137; see also Renishaw PLC v. Marposs Società per Azioni, 158 F.3d 1243, 1250, 48 USPQ2d 1117, 1122 (Fed. Cir. 1998) ("Where there are several common meanings for a claim term, the patent disclosure serves to point away from the improper meanings and toward the proper meanings.") and Vitronics Corp. v. Conceptronic Inc., 90 F.3d 1576, 1583, 39 USPQ2d 1573, 1577 (Fed. Cir. 1996) (construing the term "solder reflow temperature" to mean "peak reflow temperature" of solder rather than the "liquidus temperature" of solder in order to remain consistent with the specification.). If more than one extrinsic definition is consistent with the use of the words in the intrinsic record, the claim terms may be construed to encompass all consistent meanings. See, e.g., Rexnord Corp. v. Laitram Corp., 274 F.3d 1336, 1342, 60 USPQ2d 1851, 1854 (Fed. Cir. 2001) (explaining the court's analytical process for determining the meaning of disputed claim terms); Toro Co. v. White Consol. Indus., Inc., 199 F.3d 1295, 1299, 53 USPQ2d 1065, 1067 (Fed. Cir. 1999) ("[W]ords in patent claims are given their ordinary meaning in the usage of the field of the invention, unless the text of the patent makes clear that a word was used with a special meaning."). Compare MSM Investments Co. v. Carolwood Corp., 259 F.3d 1335, 1339-40, 59 USPQ2d 1856, 1859-60 (Fed. Cir. 2001) (Claims directed to a method of feeding an animal a beneficial amount of methylsulfonylmethane (MSM) to enhance the animal's diet were held anticipated by prior oral administration of MSM to human patients to relieve pain. Although the ordinary meaning of "feeding" is limited to provision of food or nourishment, the broad definition of "food" in the written description warranted finding that the claimed method encompasses the use of MSM for both nutritional and pharmacological purposes.); and Rapoport v. Dement, 254 F.3d 1053, 1059-60, 59 USPQ2d 1215, 1219-20 (Fed. Cir. 2001) (Both intrinsic evidence and the plain meaning of the term "method for treatment of sleep apneas" supported construction of the term as being limited to treatment of the underlying sleep apnea disorder itself, and not encompassing treatment of anxiety and other secondary symptoms related to sleep apnea.).

{Emphasis added}

 

The term “civilian airspace” is commonly used in the aerospace community.

 

The reference Sensing Requirements for Unmanned Air Vehicles, AFRL's Air Vehicles Directorate, Control Sciences Division, Systems Development Branch, Wright-Patterson AFB OH, June 2004, was filed with the application. (See Evidence Appendix Exhibit 9 at 462 and Exhibit 10 at 465.)

 

The very first paragraph refers to “civilian airspace.”

Engineers from the Air Vehicles Directorate transferred unmanned air vehicle (UAV) sensing system requirements for airspace operations to civilian UAV users and developers. These requirements represent design goals on which to base future sensing subsystem designs, filling an omission in UAV technology planning. Directorate engineers are continuing to develop the technologies that will enable future UAVs to coexist with manned aircraft in both military and civilian airspace. Incorporating these requirements will ensure that engineers design future UAVs to detect possible conflicts, such as midair collisions or runway incursions, and take action to avoid them.

 

{Emphasis added}

 

Here it is again in the third paragraph.

With this goal in mind, directorate engineers worked with Northrop Grumman Corporation (NGC) engineers to establish, iterate, and finalize sensing system performance requirements for the broad range of future Air Force missions. During this collaborative process, directorate engineers noted that many mission elements were similar to civilian airspace operations tasks, and that the requirements they were developing were directly applicable to civilian UAV technology.

 

{Emphasis added}

 

Thus, there is civilian airspace and there is military airspace.

 

This is consistent with the remarks made by FAA Administrator Babbitt in a speech he gave November 18, 2009. (See Evidence Appendix Exhibit 14 at 498.)

 

In the second paragraph he says:

So if we are direct with ourselves here, as of today, unmanned aircraft systems are not ready for seamless or routine use yet in civilian airspace. The idea of pilots flying remotely has been around for a long time. And it is, I truly believe, the way of the future. But where we are, on numerous fronts, they’re not ready for open access to the NAS and we can’t give you the thumbs up.

 

Indeed, he equates “civilian airspace” with the NAS. (NAS is the “National Airspace System.”)

 

He does it again in the next paragraph.

And you know that I’m not telling you anything that your technical folks aren’t already telling you. While the UAS is undoubtedly the way of the future, my concern must be on today, and right now, the era of the unmanned aircraft system in civilian airspace is just not here yet. Much as we’d all wish the case were different, the level of technical maturity isn’t where it needs to be for full operation in the NAS.

{Emphasis added}

 

And in paragraph 11:

That kind of scenario notwithstanding, I think unmanned aircraft systems are here to stay. In FY-09, there were about 20,000 flights in civilian airspace for a total of over 2,500 hours. And the number of operations that have been granted has more than tripled since 2007. But in order for us to get to the place where the UAS can become a viable, accepted part of the national airspace system, we have to make sure that sense-and-avoid is more than a given — it must be a guarantee.

 

Thus, the term “civilian airspace” is commonly used in the aerospace community (which includes the FAA and the Air Force) and Margolin is entitled to its commonly used (and plain) meaning.

 

Margolin used the term “civilian airspace” because the military controls its own airspace (such as around its bases) and makes its own rules. The Margolin application documents the difficulties in flying UAVs in civilian airspace in BACKGROUND OF THE INVENTION - Current Practice. (See page 2, line 33 - page 4, line 21.) That is why Margolin’s invention is directed to safely flying UAVs in civilian airspace.

 

The significance of the Examiner’s strategy in denying Margolin the common use of the term can be found in Ex parte MAURICE GIVENS Appeal 2009-003414, BPAI Informative Decision, Decided: August 6, 2009. (See Evidence Appendix Exhibit 13 at 493.) 

 

In Givens (Evidence Appendix Exhibit 13 at 494):

     The Examiner rejected claims 1-15 under 35 U.S.C. § 102(e) based upon the teachings of Lin.

 

     The only contention is whether Lin teaches a sub-band spectral subtractive routine external to an LMS-based adaptive filter (App. Br. 7-9; Reply Br. 16-17; Ans. 11-12).

 

     The Examiner finds that Lin teaches an LMS adaptive noise canceller 1412 that includes a sub-band spectral subtraction routine 1410 (Ans. 13). The Examiner further finds that Appellant has not provided a specific definition of “sub-band spectral subtractive routine” and thus, giving the term its broadest reasonable interpretation, the term can include any adaptive filter (Ans. 12). We cannot agree.

 

     Appellant’s Specification explains that “sub-band spectral subtraction algorithms are . . . known to those skilled in the art” in paragraph [0023], sets forth the sub-band spectral subtractive mechanism in paragraph [0032], and also sets forth the function that implements the sub-band spectral noise reduction algorithm (Appendix-Spec: 21-22). Although Appellant’s Specification does not specifically define the term “sub-band spectral subtractive routine,” this is a specific claim term for a specific type of filtering (Spec. ¶[0032]). Any interpretation that fails to give weight to “sub-band,” “spectral,” “subtractive,” and “routine” deprives the words in this claim term of their normal meaning. Thus, the “sub-band spectral subtractive routine” does not include just any adaptive filter, but rather refers to a specific filtering routine. Further, the output from Lin’s LMS based adaption circuit is fed to a summer 1124, 1224 (Lin Fig. 14), not a sub-band spectral subtractive routine. A summer is an additive circuit and not a subtractive circuit. Also, Lin does not describe the summer as operating on a sub-band. Thus, because Lin does not disclose each and every element of Appellant’s invention, Lin does not anticipate claims 1-15. RCA Corp. v. Appl. Dig. Data Sys., Inc., 730 F.2d 1440, 1444 (Fed. Cir. 1984).

 

Thus, the Examiner’s assertion that Margolin does not define “civilian airspace” (and is not entitled to the common meaning of the term) serves the purpose of giving the prior art a broader interpretation than it merits. Indeed, as has been shown, the Examiner has used even more than the broadest possible interpretation of the prior art.

 

 

Ground C.  

 

Whether Margolin had a duty to define “safety” or whether he was entitled to use the common meaning of the term; and whether Margolin defined a particular level of safety.

 

In the Office Action of February 15, 2011 on page 10 (Evidence Appendix Exhibit 6 at 446) the Examiner stated:

Some of applicant's remarks are that the prior art does not recite the phrase, "safely flying an unmanned aerial vehicle in civilian airspace comprising:...". Applicant thus insists that the rejection is conclusory and is not supported. The examiner disagrees and notes that any particular level of safety is not described or disclosed in the specification nor is there any meaning provided for " civilian airspace" or "safety".

 

Margolin does define safety as well as the particular level of safety. In Application page 4, lines 24-26 (Evidence Appendix Exhibit 1 at 64):

[010]   It is important when flying a UAV in an airspace shared with other aircraft, both civilian and military, that collisions during all phases of flight (including taking off and landing) not happen.

This is consistent with the aerospace community’s use of the term. MPEP § 2111.01 Parts I and III, cited above, apply here as well, as does the above-cited reference Sensing Requirements for Unmanned Air Vehicles, AFRL's Air Vehicles Directorate, Control Sciences Division, Systems Development Branch, Wright-Patterson AFB OH, June 2004, which was filed with the application. (See Evidence Appendix Exhibit 9 at 462 and Exhibit 10 at 465.)

Engineers from the Air Vehicles Directorate transferred unmanned air vehicle (UAV) sensing system requirements for airspace operations to civilian UAV users and developers. These requirements represent design goals on which to base future sensing subsystem designs, filling an omission in UAV technology planning. Directorate engineers are continuing to develop the technologies that will enable future UAVs to coexist with manned aircraft in both military and civilian airspace. Incorporating these requirements will ensure that engineers design future UAVs to detect possible conflicts, such as midair collisions or runway incursions, and take action to avoid them.

 

{Emphasis added}

 

Another reference filed by Margolin in his patent application is Presentation: Developing Sense & Avoid Requirements for Meeting an Equivalent Level of Safety, given by Russ Wolfe, Technology IPT Lead, Access 5 Project, at UVS Tech 2006 (January 18, 2006). (See Evidence Appendix Exhibit 9 at 462 and Exhibit 11 at 469.)

 

Page 7 of the presentation (Evidence Appendix Exhibit 11 at 475) says:

Task 1: ELOS Definition Document

Definition and Measures of Performance

Definition: “Equivalent level of safety to manned aircraft see-and-avoid” is the capability to provide situational awareness with adequate time to detect conflicting traffic and the ability to take appropriate action necessary to avoid collisions.

 

And there is only one level of safety.

 

In the remarks cited above, from the speech FAA Administrator Babbitt gave November 18, 2009 (see Evidence Appendix Exhibit 14 at 498), he said:

Good afternoon, and thank you, John [Langford, Chairman & President, Aurora Flight Sciences]. It’s an exciting time in aviation and to be involved with introducing new technology into the National Airspace System. It’s also a good time to be thinking and talking about personal and professional responsibility — something I have unfortunately had to do too much of lately. But we all — every professional in aviation — have a shared responsibility to make this system as absolutely safe as it can be, and never to just a level where we would ever say, “We could do more, but this is safe enough”.

 

{Emphasis added}

 

Again, the significance of the Examiner’s strategy in denying Margolin the common use of the term can be found in Ex parte MAURICE GIVENS Appeal 2009-003414, BPAI Informative Decision, Decided: August 6, 2009. (See Evidence Appendix Exhibit 13 at 494.) 

 

The Examiner’s assertion that Margolin does not define “safety” or “a particular level of safety” (and is not entitled to the common meaning of the terms) serves the purpose of giving the prior art a broader interpretation than it merits. Indeed, as has been shown, the Examiner has used even more than the broadest possible interpretation of the prior art.

 

Ground D. 

 

Whether the Examiner’s assertion that “It is believed that the aircraft flown in the prior art is flown safely …” (an assertion made without evidence) is proper.

 

This strikes at the heart of Margolin’s invention. If “ … the aircraft flown in the prior art is flown safely …” then Margolin’s invention is irrelevant. And it would not be useful.

 

According to MPEP § 2144.03, Reliance on Common Knowledge in the Art or "Well Known" Prior Art, the Examiner’s statement constitutes Taking Official Notice.

 

However, the Examiner has completely failed to follow the rules for Taking Official Notice.

 

1.  The Examiner failed to provide any evidence for his statement. From MPEP § 2144.03(A):

Official notice without documentary evidence to support an examiner's conclusion is permissible only in some circumstances. While "official notice" may be relied on, these circumstances should be rare when an application is under final rejection or action under 37 CFR 1.113. Official notice unsupported by documentary evidence should only be taken by the examiner where the facts asserted to be well-known, or to be common knowledge in the art are capable of instant and unquestionable demonstration as being well-known. As noted by the court in In re Ahlert, 424 F.2d 1088, 1091, 165 USPQ 418, 420 (CCPA 1970), the notice of facts beyond the record which may be taken by the examiner must be "capable of such instant and unquestionable demonstration as to defy dispute"

 

2.  The Examiner failed to provide even a technical line of reasoning for making his statement. From MPEP § 2144.03(B):

 

B.    If Official Notice Is Taken of a Fact, Unsupported by Documentary Evidence, the Technical Line of Reasoning Underlying a Decision To Take Such Notice Must Be Clear and Unmistakable

 

In certain older cases, official notice has been taken of a fact that is asserted to be "common knowledge" without specific reliance on documentary evidence where the fact noticed was readily verifiable, such as when other references of record supported the noticed fact, or where there was nothing of record to contradict it. See In re Soli, 317 F.2d 941, 945-46, 137 USPQ 797, 800 (CCPA 1963) (accepting the examiner's assertion that the use of "a control is standard procedure throughout the entire field of bacteriology" because it was readily verifiable and disclosed in references of record not cited by the Office); …

 

3.  Because the Examiner made his statement in a Final Office Action, Margolin was denied the opportunity to challenge the Examiner to provide evidence for his statement as provided for in MPEP § 2144.03(C):

 

C.    If Applicant Challenges a Factual Assertion as Not Properly Officially Noticed or Not Properly Based Upon Common Knowledge, the Examiner Must Support the Finding With Adequate Evidence

 

To adequately traverse such a finding, an applicant must specifically point out the supposed errors in the examiner's action, which would include stating why the noticed fact is not considered to be common knowledge or well-known in the art. See 37 CFR 1.111(b).

 

4.  The Examiner took Official Notice in an Office Action, which constituted both a new issue and a new ground of rejection, and improperly made the Office Action final. From MPEP § 2144.03(D):

D.    Determine Whether the Next Office Action Should Be Made Final

 

If the examiner adds a reference in the next Office action after applicant's rebuttal, and the newly added reference is added only as directly corresponding evidence to support the prior common knowledge finding, and it does not result in a new issue or constitute a new ground of rejection, the Office action may be made final. If no amendments are made to the claims, the examiner must not rely on any other teachings in the reference if the rejection is made final. If the newly cited reference is added for reasons other than to support the prior common knowledge statement and a new ground of rejection is introduced by the examiner that is not necessitated by applicant's amendment of the claims, the rejection may not be made final. See MPEP § 706.07(a).

 

5.  The Examiner’s statement is dead wrong. It was dead wrong in 2006 (Margolin’s priority date) and it is still dead wrong today.

 

Margolin’s application used as exemplars the crash of the Predator patrolling the U.S. Southern border (see Application page 4, lines 13-21) and the crash of the Lockheed Martin Polecat. (See Application page 4, lines 6-9 and the reference from Aviation Week & Space Technology, Lockheed Martin’s Polecat UCAV Demonstrator Crashes, which was filed with the application and is reproduced here as Evidence Appendix Exhibit 12 at 489.) The Examiner indicated he had considered the reference in Evidence Appendix Exhibit 9 at 463.

 

Margolin’s Application contained the reference Sensing Requirements for Unmanned Air Vehicles, AFRL Air Vehicles (Evidence Appendix Exhibit 10 at 465). The second paragraph stated that “Present UAVs cannot detect manned aircraft and conflict situations and, therefore, they cannot share airspace with manned aircraft.”

 

Another reference filed by Margolin in his patent application is Presentation: Developing Sense & Avoid Requirements for Meeting an Equivalent Level of Safety, given by Russ Wolfe, Technology IPT Lead, Access 5 Project, at UVS Tech 2006 (January 18, 2006). (See Evidence Appendix Exhibit 9 at 462 and Exhibit 11 at 469.) The conference was held in order to develop UAS Collision Avoidance Initiatives. If UAVs were already being flown safely, the conference would not have been necessary.

 

The aerospace community had not solved the problem by November 2009, when FAA Administrator Babbitt gave his speech (Evidence Appendix Exhibit 14 at 498) and said (second paragraph):

So if we are direct with ourselves here, as of today, unmanned aircraft systems are not ready for seamless or routine use yet in civilian airspace. The idea of pilots flying remotely has been around for a long time. And it is, I truly believe, the way of the future. But where we are, on numerous fronts, they’re not ready for open access to the NAS and we can’t give you the thumbs up.

 

Even more recently, an article from TheLedger.com (Lakeland, FL) dated July 6, 2010 details the problems with UAVs. (The article refers to UAVs by the much older term “drones.”) The article is called Pentagon Accident Reports Suggest Military's Drone Aircraft Plagued With Problems and is reproduced in Evidence Appendix Exhibit 15 at 502.

 

The Examiner’s statement, presented without evidence, is contradicted by the evidence shown in the references Margolin filed with his application (and which the Examiner asserted he had considered) and by more recent evidence.


VIII.   CLAIMS APPENDIX

 

A copy of the claims involved in the present appeal is attached hereto as Appendix A.

 

IX.      EVIDENCE APPENDIX

 

The Evidence Appendix is attached as Appendix B. With the exception of Exhibits 2 and 4, the following exhibits were reproduced from the Image File Wrapper. Presumably, they are in the Image File Wrapper because they were entered by the Examiner. Exhibits 2 and 4 were cited by the Examiner, which counts as their being constructively entered by the Examiner.

 

Exhibit 1         Patent Application as filed  ………………………………………..….…  61

Exhibit 2         U.S. Patent 5,904,724  ……………………………………………..…....  87

Exhibit 3         First Office Action on the Merits  ……………………………………...  102

Exhibit 4         U. S. Patent Application 20050004723 (Duggan) …………………….... 115

Exhibit 5         Applicant’s Response to First Office Action  …………………………..  193

Exhibit 6         Second Office Action  ………………………………………………...… 435

Exhibit 7         Applicant’s Summary of Telephone Interview with Examiner ……….... 452

Exhibit 8         Applicant’s Summary of Telephone Interview with Examiner’s SPE  … 457

Exhibit 9         IDS References Considered by Examiner  ..……………………………. 461

Exhibit 10       Sensing Requirements for Unmanned Air Vehicles, AFRL Air Vehicles

                        Directorate  ……………………………………………………………… 465 

Exhibit 11       Developing Sense and Avoid Requirements for Meeting An Equivalent

                        Level of Safety, Russel Wolfe ..………………………………………... 469

Exhibit 12       Article - Lockheed's Polecat UCAV Demonstrator Crashes, Aviation

                        Week & Space Technology, by Amy Butler, 03/19/2007, page 44 ......... 489

 

New evidence is being presented in this appeal. This new evidence is necessary because in the Second Office Action, mailed 2/15/2011, the Examiner added a section called Response to Arguments in which he expanded the grounds for rejection that he made in the First Office Action. As a result, he constructively introduced new grounds for rejection.

1.  Margolin had not amended his claims in his Response to the First Office Action.

2.  The Examiner could have made his new grounds for rejection in his First Office Action but failed to do so.

 

The interests of fairness, as well as MPEP § 706.07(a), require that Margolin be allowed to introduce this new evidence.

 

In addition, there is a conflict between 37 C.F.R. § 41.33(d)(1), Amendments and affidavits or other evidence after appeal, and the Streamlined Procedure for Appeal Brief Review published in the Federal Register, Vol. 75, No. 60. (United States Patent and Trademark Office Docket No. PTO–P–2010–0026.)

 

Prior to the Streamlined Procedure, the Examiner decided whether an Appeal Brief was compliant with the Rules.[2]  Under the Streamlined Procedure this determination is now made by the Chief Judge of the Board of Patent Appeals and Interferences (BPAI) or his designee. Since the Chief Judge probably has more important things to do, compliance will likely be determined by a paralegal.

 

That would be acceptable except for 37 C.F.R. § 41.33(d)(1), Amendments and affidavits or other evidence after appeal:

(1) An affidavit or other evidence filed after the date of filing an appeal pursuant to § 41.31(a)(1) through (a)(3) and prior to the date of filing a brief pursuant to § 41.37 may be admitted if the examiner determines that the affidavit or other evidence overcomes all rejections under appeal and that a showing of good and sufficient reasons why the affidavit or other evidence is necessary and was not earlier presented has been made.

 

The Chief Judge’s paralegal is not the Examiner and is unlikely to have the authority to determine whether Margolin’s new evidence overcomes the Examiner’s rejections.[3]


Exhibit 13       Ex parte MAURICE GIVENS Appeal 2009-003414

          BPAI Informative Decision, Decided: August 6, 2009  ………………... 493

 

Exhibit 14       Speech - "Safety Must Come First"; J. Randolph Babbitt, FAA

                        Administrator; Scottsdale, AZ; November 18, 2009;

                         http://www.faa.gov/news/speeches/news_story.cfm?newsId=10964  …. 498

 

Exhibit 15       Article - Pentagon Accident Reports Suggest Military's Drone

                        Aircraft Plagued With Problems, by David Zucchino, from

                        The Ledger.com, July 6, 2010.

                        http://www.theledger.com/article/20100706/NEWS/7065101  ……….... 502

 

 

X.   RELATED PROCEEDINGS

 

            There are no decisions rendered by a court or by BPAI in this application.

 

 

Respectfully submitted,

 

/Jed Margolin/

 

Jed Margolin

pro se inventor

June 16, 2011

(775) 847-7845

 

___________________________________________________________________________________

 

I hereby certify that this correspondence is being filed through the USPTO’s Electronic Filing System.

 

Date:   June 16, 2011                         Inventor's Signature:   /Jed Margolin/

                       

                                                                                                Jed Margolin


Claims Appendix

 

Claims involved in the Appeal of Application Serial Number 11/736,356

1.   A system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

(a)    a ground station equipped with a synthetic vision system;

(b)   an unmanned aerial vehicle capable of supporting said synthetic vision system;

(c)    a remote pilot operating said ground station;

(d)  a communications link between said unmanned aerial vehicle and said ground station;

(e)  a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot;

 

whereas said remote pilot uses said synthetic vision system to control said unmanned aerial vehicle during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system.

 

2.   The system of claim 1 whereby said selected phases of the flight of said unmanned aerial vehicle comprise:

 

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

 

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

3.   The system of claim 1 further comprising a system onboard said unmanned aerial vehicle for periodically transmitting the identification, location, altitude, and bearing of said unmanned aerial vehicle.

 

4.   The system of claim 1 further comprising a system onboard said unmanned aerial vehicle for providing a communications channel for Air Traffic Control and the pilots of other aircraft to communicate directly with said remote pilot.

 

5.   A system for safely flying an unmanned aerial vehicle in civilian airspace comprising:

(a)     a ground station equipped with a synthetic vision system;

(b)    an unmanned aerial vehicle capable of supporting said synthetic vision system;

(c)     a remote pilot operating said ground station;

(d)  a communications link between said unmanned aerial vehicle and said ground station;

(e)  a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot;

 

whereas said remote pilot uses said synthetic vision system to control said unmanned aerial vehicle during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle said unmanned aerial vehicle is flown using an autonomous control system, and

 

whereas the selected phases of the flight of said unmanned aerial vehicle comprise:

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

6.   The system of claim 5 further comprising a system onboard said unmanned aerial vehicle for periodically transmitting the identification, location, altitude, and bearing of said unmanned aerial vehicle.

 

7.   The system of claim 5 further comprising a system onboard said unmanned aerial vehicle for providing a communications channel for Air Traffic Control and the pilots of other aircraft to communicate directly with said remote pilot.

 

8.   A method for safely flying an unmanned aerial vehicle as part of a unmanned aerial system equipped with a synthetic vision system in civilian airspace comprising the steps of:

 

(a)  using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

 

(b)  providing a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot.

 

9.  The method of claim 8 whereby said selected phases of the flight of said unmanned aerial vehicle comprise:

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

10.   The method of claim 8 further comprising the step of providing a system onboard said unmanned aerial vehicle for periodically transmitting the identification, location, altitude, and bearing of said unmanned aerial vehicle.

 

11.   The method of claim 8 further comprising the step of providing a system onboard said unmanned aerial vehicle for providing a communications channel for Air Traffic Control and the pilots of other aircraft to communicate directly with said remote pilot.


12.   A method for safely flying an unmanned aerial vehicle as part of an unmanned aerial system equipped with a synthetic vision system in civilian airspace comprising the steps of:

 

(a)  using a remote pilot to fly said unmanned aerial vehicle using synthetic vision during at least selected phases of the flight of said unmanned aerial vehicle, and during those phases of the flight of said unmanned aerial vehicle when said synthetic vision system is not used to control said unmanned aerial vehicle an autonomous control system is used to fly said unmanned aerial vehicle;

 

(b)  providing a system onboard said unmanned aerial vehicle for detecting the presence and position of nearby aircraft and communicating this information to said remote pilot;

 

wherein said selected phases of the flight of said unmanned aerial vehicle comprise:

(a)  when said unmanned aerial vehicle is within a selected range of an airport or other designated location and is below a first specified altitude;

(b)  when said unmanned aerial vehicle is outside said selected range of an airport or other designated location and is below a second specified altitude.

 

13.   The method of claim 12 further comprising the step of providing a system onboard said unmanned aerial vehicle for periodically transmitting the identification, location, altitude, and bearing of said unmanned aerial vehicle.

 

14.   The method of claim 12 further comprising the step of providing a system onboard said unmanned aerial vehicle for providing a communications channel for Air Traffic Control and the pilots of other aircraft to communicate directly with said remote pilot.


Appendix B - Evidence Appendix

Exhibit 1         Patent Application as filed  ………………………………………….….  61

Exhibit 2         U.S. Patent 5,904,724 (Margolin)...………………………………….…  87

Exhibit 3         First Office Action on the Merits  …………………………….…….… 102

Exhibit 4         U. S. Patent Application 20050004723 (Duggan) …………………...... 115

Exhibit 5         Applicant’s Response to First Office Action  ………………………..... 193

Exhibit 6         Second Office Action  ………………………………………………… 435

Exhibit 7         Applicant’s Summary of Telephone Interview with Examiner …….… 452

Exhibit 8         Applicant’s Summary of Telephone Interview with

                                    Examiner’s SPE  …………………………………………….... 457

Exhibit 9         IDS References Considered by Examiner  ..………………………..… 461

Exhibit 10       Sensing Requirements for Unmanned Air Vehicles,

                                    AFRL Air Vehicles Directorate  ……………………….….…. 465

Exhibit 11       Developing Sense and Avoid Requirements for Meeting

                                    An Equivalent Level of Safety, Russel Wolfe  ..………….... 469

Exhibit 12       Article - Lockheed's Polecat UCAV Demonstrator Crashes, Aviation

                        Week & Space Technology, by Amy Butler, 03/19/2007, page 44 ..... 489

Exhibit 13       Ex parte MAURICE GIVENS Appeal 2009-003414

                      BPAI Informative Decision, Decided: August 6, 2009  …….. 493

Exhibit 14       Speech - "Safety Must Come First"; J. Randolph Babbitt,

                                    FAA Administrator; November 18, 2009, FAA Web site  .…. 498

Exhibit 15       Article - Pentagon Accident Reports Suggest Military's Drone

                                    Aircraft Plagued With Problems, by David Zucchino, from

                                    The Ledger.com, July 6, 2010.

                                    http://www.theledger.com/article/20100706/NEWS/7065101  .. 502


 



[1] In Margolin’s telephone interview with the Examiner, the Examiner was unaware that Margolin (the current Applicant) is the same Margolin named as the inventor in ‘724. At one point during the interview the Examiner was confused as to whether Margolin was Margolin or Duggan. (See Summary of Telephone Interview with the Examiner, Evidence Appendix, Exhibit 7 at 452.)

[2] This comes under the category of putting the Fox in charge of the Hen House.

 

[3] As previously noted, the new evidence is necessary to respond to the Examiner’s expanded grounds for rejection, which is why it could not be presented earlier. Margolin will remind the Board that if it does not allow Margolin to submit this new evidence, the U.S. District Court for the District of Columbia will. (Hyatt v. Kappos, No. 2007-1066, U.S. Court of Appeals for the Federal Circuit, 2010 U.S. App. LEXIS 23117, 8 November 2010)