Playback device grouping

Lambourne, et al.

Patent Grant 10387102

U.S. patent number 10,387,102 [Application Number 15/095,145] was granted by the patent office on 2019-08-20 for playback device grouping. This patent grant is currently assigned to Sonos, Inc. The grantee listed for this patent is Sonos, Inc. Invention is credited to Robert A. Lambourne and Nicholas A. J. Millington.


United States Patent 10,387,102
Lambourne, et al. August 20, 2019

Playback device grouping

Abstract

In general, user interfaces for controlling a plurality of multimedia players in groups are disclosed. According to one aspect of the present invention, a user interface is provided to allow a user to group some of the players according to a theme or scene, where each of the players is located in a zone. When the scene is activated, the players in the scene react in a synchronized manner. For example, the players in the scene are all caused to play a multimedia source or music in a playlist, wherein the multimedia source may be located anywhere on a network. The user interface is further configured to illustrate graphically a size of a group, where the larger the group appears relatively, the more players there are in the group.
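To make the grouping model in the abstract concrete, the following is a minimal Python sketch of the idea only, not the patented implementation or any Sonos API: the Player, Scene, activate, and now_playing names are hypothetical, and activating a scene simply assigns the same source to every member player to stand in for synchronized group playback.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Player:
    """A playback device located in a named zone (hypothetical model)."""
    zone: str
    now_playing: Optional[str] = None


@dataclass
class Scene:
    """A named theme grouping players that should react together."""
    name: str
    players: List[Player] = field(default_factory=list)

    def activate(self, source: str) -> None:
        # Activating the scene points every member player at the same
        # source, a stand-in for synchronized group playback.
        for player in self.players:
            player.now_playing = source


# Example: a "Morning" scene spanning two zones.
kitchen = Player(zone="Kitchen")
bedroom = Player(zone="Master Bedroom")
morning = Scene(name="Morning", players=[kitchen, bedroom])
morning.activate("Jazz Playlist")
print([(p.zone, p.now_playing) for p in morning.players])
```

In this sketch the UI concern described in the abstract (drawing a group larger when it contains more players) would simply read len(scene.players); the synchronization mechanics themselves are outside the scope of the example.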


Inventors: Lambourne; Robert A. (Santa Barbara, CA), Millington; Nicholas A. J. (Santa Barbara, CA)
Applicant: Sonos, Inc. (Santa Barbara, CA, US)
Assignee: Sonos, Inc. (Santa Barbara, CA)
Family ID: 46981809
Appl. No.: 15/095,145
Filed: April 10, 2016

Prior Publication Data

Document Identifier Publication Date
US 20160241983 A1 Aug 18, 2016

Related U.S. Patent Documents

Application Number   Filing Date     Patent Number   Issue Date
14808875             Jul 24, 2015
13907666                             9141645         Sep 22, 2015
13619237                             8588949         Nov 19, 2013
12035112                             8290603         Oct 16, 2012
10861653                             7571014         Aug 4, 2009
10816217                             8234395         Jul 31, 2012
60490768             Jul 28, 2003

Current U.S. Class: 1/1
Current CPC Class: G11B 27/00 (20130101); G06F 3/0482 (20130101); H04S 7/30 (20130101); G06F 3/162 (20130101); G06F 3/04842 (20130101); G05B 15/02 (20130101); G06F 3/04855 (20130101); G06F 16/27 (20190101); G06F 3/165 (20130101); H04R 27/00 (20130101); G06F 3/04847 (20130101); H04S 7/00 (20130101); H04R 29/008 (20130101); H04R 2227/003 (20130101); H04R 2227/005 (20130101)
Current International Class: G06F 17/00 (20190101); G06F 16/27 (20190101); G06F 3/0484 (20130101); G06F 3/0482 (20130101); G06F 3/16 (20060101); G06F 3/0485 (20130101); G11B 27/00 (20060101); H04R 27/00 (20060101); H04S 7/00 (20060101); G05B 15/02 (20060101); H04R 29/00 (20060101)
Field of Search: ;700/94 ;381/119

References Cited [Referenced By]

U.S. Patent Documents
3956591 May 1976 Gates, Jr.
4105974 August 1978 Rogers
D260764 September 1981 Castagna et al.
4296278 October 1981 Cullison et al.
4306114 December 1981 Callahan
4509211 April 1985 Robbins
D279779 July 1985 Taylor
4530091 July 1985 Crockett
4696037 September 1987 Fierens
4701629 October 1987 Citroen
4712105 December 1987 Koehler
D293671 January 1988 Beaumont
4731814 March 1988 Becker et al.
4816989 March 1989 Finn et al.
4824059 April 1989 Butler
D301037 May 1989 Matsuda
4845751 July 1989 Schwab
D304443 November 1989 Grinyer et al.
D313023 December 1990 Kolenda et al.
D313398 January 1991 Gilchrist
D313600 January 1991 Weber
4994908 February 1991 Kuban et al.
D320598 October 1991 Auerbach et al.
D322609 December 1991 Patton
5086385 February 1992 Launey et al.
D326450 May 1992 Watanabe
D327060 June 1992 Wachob et al.
5151922 September 1992 Weiss
5153579 October 1992 Fisch et al.
D331388 December 1992 Dahnert et al.
5182552 January 1993 Paynting
D333135 February 1993 Wachob et al.
5185680 February 1993 Kakubo
5198603 March 1993 Nishikawa et al.
5237327 August 1993 Saitoh et al.
5239458 August 1993 Suzuki
5272757 December 1993 Scofield et al.
5299266 March 1994 Lumsden
D350531 September 1994 Tsuji
D350962 September 1994 Reardon et al.
5361381 November 1994 Short
5372441 December 1994 Louis
D354059 January 1995 Hendricks
D354751 January 1995 Hersh et al.
D356093 March 1995 McCauley et al.
D356312 March 1995 Althans
D357024 April 1995 Tokiyama et al.
5406634 April 1995 Anderson et al.
5430485 July 1995 Lankford et al.
5440644 August 1995 Farinelli et al.
D362446 September 1995 Gasiorek et al.
5457448 October 1995 Totsuka et al.
D363933 November 1995 Starck
5467342 November 1995 Logston et al.
D364877 December 1995 Tokiyama et al.
D364878 December 1995 Green et al.
D365102 December 1995 Gioscia
D366044 January 1996 Hara et al.
5481251 January 1996 Buys et al.
5491839 February 1996 Schotz
5515345 May 1996 Barreira et al.
5533021 July 1996 Branstad et al.
D372716 August 1996 Thorne
5553147 September 1996 Pineau
5553222 September 1996 Milne et al.
5553314 September 1996 Grube et al.
D377651 January 1997 Biasotti et al.
5596696 January 1997 Tindell et al.
5602992 February 1997 Danneels
5623483 April 1997 Agrawal et al.
5625350 April 1997 Fukatsu et al.
5633871 May 1997 Bloks
D379816 June 1997 Laituri et al.
5636345 June 1997 Valdevit
5640388 June 1997 Woodhead et al.
D380752 July 1997 Hanson
5652749 July 1997 Davenport et al.
D382271 August 1997 Akwiwu
5661665 August 1997 Glass et al.
5661728 August 1997 Finotello et al.
5668884 September 1997 Clair, Jr. et al.
5673323 September 1997 Schotz et al.
D384940 October 1997 Kono et al.
D387352 December 1997 Kaneko et al.
5696896 December 1997 Badovinatz et al.
D388792 January 1998 Nykerk
D389143 January 1998 Wicks
D392641 March 1998 Fenner
5726989 March 1998 Dokic
5732059 March 1998 Katsuyama et al.
D393628 April 1998 Ledbetter et al.
5740235 April 1998 Lester et al.
5742623 April 1998 Nuber et al.
D394659 May 1998 Biasotti et al.
5751819 May 1998 Dorrough
5761320 June 1998 Farinelli et al.
5774016 June 1998 Ketterer
D395889 July 1998 Gerba et al.
5787249 July 1998 Badovinatz et al.
5790543 August 1998 Cloutier
D397996 September 1998 Smith
5808662 September 1998 Kinney et al.
5812201 September 1998 Yoo
5815689 September 1998 Shaw et al.
5818948 October 1998 Gulick
D401587 November 1998 Rudolph
5832024 November 1998 Schotz et al.
5838909 November 1998 Roy et al.
5848152 December 1998 Slipy et al.
5852722 December 1998 Hamilton
D404741 January 1999 Schumaker et al.
D405071 February 1999 Gambaro
5867691 February 1999 Shiraishi
5875233 February 1999 Cox
5875354 February 1999 Charlton et al.
D406847 March 1999 Gerba et al.
D407071 March 1999 Keating
5887143 March 1999 Saito et al.
5905768 May 1999 Maturi et al.
D410927 June 1999 Yamagishi
5917830 June 1999 Chen et al.
D412337 July 1999 Hamano
5923869 July 1999 Kashiwagi et al.
5923902 July 1999 Inagaki
5946343 August 1999 Schotz et al.
5956025 September 1999 Goulden et al.
5956088 September 1999 Shen et al.
5960006 September 1999 Maturi et al.
D415496 October 1999 Gerba et al.
D416021 November 1999 Godette et al.
5984512 November 1999 Jones et al.
5987525 November 1999 Roberts et al.
5987611 November 1999 Freund
5990884 November 1999 Douma et al.
5991307 November 1999 Komuro et al.
5999906 December 1999 Mercs et al.
6009457 December 1999 Moller
6018376 January 2000 Nakatani
D420006 February 2000 Tonino
6026150 February 2000 Frank et al.
6029196 February 2000 Lenz
6031818 February 2000 Lo et al.
6032202 February 2000 Lea et al.
6038614 March 2000 Chan et al.
6046550 April 2000 Ference et al.
6061457 May 2000 Stockhamer
6078725 June 2000 Tanaka
6081266 June 2000 Sciammarella
6088063 July 2000 Shiba
D429246 August 2000 Holma
D430143 August 2000 Renk
6101195 August 2000 Lyons et al.
6108485 August 2000 Kim
6108686 August 2000 Williams, Jr.
6122668 September 2000 Teng et al.
D431552 October 2000 Backs et al.
D432525 October 2000 Beecroft
6127941 October 2000 Van Ryzin
6128318 October 2000 Sato
6148205 November 2000 Cotton
6154772 November 2000 Dunn et al.
6157957 December 2000 Berthaud
6163647 December 2000 Terashima et al.
6169725 January 2001 Gibbs et al.
6175872 January 2001 Neumann et al.
6181383 January 2001 Fox et al.
6185737 February 2001 Northcutt et al.
6195435 February 2001 Kitamura
6195436 February 2001 Scibora et al.
6199169 March 2001 Voth
6212282 April 2001 Mershon
6246701 June 2001 Slattery
6253293 June 2001 Rao et al.
D444475 July 2001 Levey et al.
6255961 July 2001 Van Ryzin et al.
6256554 July 2001 Dilorenzo
6269406 July 2001 Dutcher et al.
6301012 October 2001 White et al.
6308207 October 2001 Tseng et al.
6310652 October 2001 Li et al.
6313879 November 2001 Kubo et al.
6321252 November 2001 Bhola et al.
6324586 November 2001 Johnson
D452520 December 2001 Gotham et al.
6332147 December 2001 Moran et al.
6343028 January 2002 Kuwaoka
6349285 February 2002 Liu et al.
6349339 February 2002 Williams
6349352 February 2002 Lea
6351821 February 2002 Voth
6353172 March 2002 Fay et al.
6356871 March 2002 Hemkumar et al.
6404811 June 2002 Cvetko et al.
6418150 July 2002 Staats
6430353 August 2002 Honda et al.
6442443 August 2002 Fujii et al.
D462339 September 2002 Allen et al.
D462340 September 2002 Allen et al.
D462945 September 2002 Skulley
6446080 September 2002 Van Ryzin et al.
6449642 September 2002 Bourke-Dunphy et al.
6449653 September 2002 Klemets et al.
6456783 September 2002 Ando et al.
6463474 October 2002 Fuh et al.
6466832 October 2002 Zuqert et al.
6469633 October 2002 Wachter
D466108 November 2002 Glodava et al.
6487296 November 2002 Allen et al.
6493832 December 2002 Itakura et al.
D468297 January 2003 Ikeda
6522886 February 2003 Youngs et al.
6526325 February 2003 Sussman et al.
6526411 February 2003 Ward
6535121 March 2003 Mathney et al.
D474763 May 2003 Tozaki et al.
D475993 June 2003 Meyer
D476643 July 2003 Yamagishi
D477310 July 2003 Moransais
6587127 July 2003 Leeke et al.
6598172 July 2003 Vandeusen et al.
D478051 August 2003 Sagawa
D478069 August 2003 Beck et al.
D478896 August 2003 Summers
6611537 August 2003 Edens et al.
6611813 August 2003 Bratton
D479520 September 2003 De Saulles
D481056 October 2003 Kawasaki et al.
6631410 October 2003 Kowalski et al.
6636269 October 2003 Baldwin
6639584 October 2003 Li
6653899 November 2003 Organvidez et al.
6654720 November 2003 Graham et al.
6654956 November 2003 Trinh et al.
6658091 December 2003 Naidoo et al.
6674803 January 2004 Kesselring
6684060 January 2004 Curtin
D486145 February 2004 Kaminski et al.
6687664 February 2004 Sussman et al.
6704421 March 2004 Kitamura
6741961 May 2004 Lim
D491925 June 2004 Griesau et al.
6757517 June 2004 Chang
D493148 July 2004 Shibata et al.
6763274 July 2004 Gilbert
D495333 August 2004 Borsboom
6778073 August 2004 Lutter et al.
6778493 August 2004 Ishii
6778869 August 2004 Champion
D496003 September 2004 Spira
D496005 September 2004 Wang
D496335 September 2004 Spira
6795852 September 2004 Kleinrock et al.
D497363 October 2004 Olson et al.
6803964 October 2004 Post et al.
6809635 October 2004 Kaaresoja
D499086 November 2004 Polito
6816104 November 2004 Lin
6816510 November 2004 Banerjee
6816818 November 2004 Wolf et al.
6823225 November 2004 Sass
6826283 November 2004 Wheeler et al.
D499395 December 2004 Hsu
D499718 December 2004 Chen
D500015 December 2004 Gubbe
6836788 December 2004 Kim et al.
6839752 January 2005 Miller et al.
D501477 February 2005 Hall
6859460 February 2005 Chen
6859538 February 2005 Voltz
6873862 March 2005 Reshefsky
6882335 April 2005 Saarinen
D504872 May 2005 Uehara et al.
D504885 May 2005 Zhang et al.
6898642 May 2005 Chafle et al.
6901439 May 2005 Bonasia et al.
D506463 June 2005 Daniels
6907458 June 2005 Tomassetti et al.
6910078 June 2005 Raman et al.
6912610 June 2005 Spencer
6915347 July 2005 Hanko et al.
6917592 July 2005 Ramankutty et al.
6919771 July 2005 Nakajima
6920373 July 2005 Xi et al.
6931557 August 2005 Togawa
6934766 August 2005 Russell
6937988 August 2005 Hemkumar et al.
6970482 November 2005 Kim
6985694 January 2006 De Bonet et al.
6987767 January 2006 Saito
D515072 February 2006 Lee
D515557 February 2006 Okuley
7006758 February 2006 Yamamoto et al.
7007106 February 2006 Flood et al.
7020791 March 2006 Aweya et al.
D518475 April 2006 Yang et al.
7043477 May 2006 Mercer et al.
7043651 May 2006 Aweya et al.
7046677 May 2006 Monta et al.
7047308 May 2006 Deshpande
7054888 May 2006 Lachapelle et al.
7058889 June 2006 Trovato et al.
7068596 June 2006 Mou
D524296 July 2006 Kita
D527375 August 2006 Flora et al.
7092528 August 2006 Patrick et al.
7092694 August 2006 Griep et al.
7096169 August 2006 Crutchfield et al.
7102513 September 2006 Taskin et al.
7106224 September 2006 Knapp et al.
7113999 September 2006 Pestoni et al.
7115017 October 2006 Laursen et al.
7120168 October 2006 Zimmermann
7130316 October 2006 Kovacevic
7130368 October 2006 Aweya et al.
7130608 October 2006 Hollstrom et al.
7130616 October 2006 Janik
7136934 November 2006 Carter et al.
7139981 November 2006 Mayer et al.
7143141 November 2006 Morgan et al.
7143939 December 2006 Henzerling
7146260 December 2006 Preston et al.
7158488 January 2007 Fujimori
7161939 January 2007 Israel et al.
7162315 January 2007 Gilbert
7164694 January 2007 Nodoushani et al.
7167765 January 2007 Janik
7185090 February 2007 Kowalski et al.
7187947 March 2007 White et al.
7188353 March 2007 Crinon
7197148 March 2007 Nourse et al.
7206367 April 2007 Moore
7206618 April 2007 Latto et al.
7206967 April 2007 Marti et al.
7209795 April 2007 Sullivan et al.
7218708 May 2007 Berezowski et al.
7218930 May 2007 Ko et al.
7236739 June 2007 Chang
7236773 June 2007 Thomas
7251533 July 2007 Yoon et al.
7257398 August 2007 Ukita et al.
7260616 August 2007 Cook
7263070 August 2007 Delker et al.
7263110 August 2007 Fujishiro
7277547 October 2007 Delker et al.
7286652 October 2007 Azriel et al.
7289631 October 2007 Ishidoshiro
7293060 November 2007 Komsi
7295548 November 2007 Blank et al.
7305694 December 2007 Commons et al.
7308188 December 2007 Namatame
7310334 December 2007 Fitzgerald et al.
7312785 December 2007 Tsuk et al.
7313593 December 2007 Pulito et al.
7319764 January 2008 Reid et al.
7324857 January 2008 Goddard
7330875 February 2008 Parasnis et al.
7333519 February 2008 Sullivan et al.
7356011 April 2008 Waters et al.
7359006 April 2008 Xiang et al.
7366206 April 2008 Lockridge et al.
7372846 May 2008 Zwack
7383036 June 2008 Kang et al.
7391791 June 2008 Balassanian et al.
7392102 June 2008 Sullivan et al.
7392481 June 2008 Gewickey et al.
7394480 July 2008 Song
7400644 July 2008 Sakamoto et al.
7412499 August 2008 Chang et al.
7428310 September 2008 Park
7430181 September 2008 Hong
7433324 October 2008 Switzer et al.
7434166 October 2008 Acharya et al.
7457948 November 2008 Bilicksa et al.
7469139 December 2008 Van De Groenendaal
7472058 December 2008 Tseng et al.
7474677 January 2009 Trott
7483538 January 2009 McCarty et al.
7483540 January 2009 Rabinowitz et al.
7483958 January 2009 Elabbady et al.
7492912 February 2009 Chung et al.
7505889 March 2009 Salmonsen et al.
7509181 March 2009 Champion
7519667 April 2009 Capps
7548744 June 2009 Oesterling et al.
7548851 June 2009 Lau et al.
7558224 July 2009 Surazski et al.
7558635 July 2009 Thiel et al.
7571014 August 2009 Lambourne et al.
7574274 August 2009 Holmes
7599685 October 2009 Goldberg et al.
7606174 October 2009 Ochi et al.
7607091 October 2009 Song et al.
7627825 December 2009 Kakuda
7630501 December 2009 Blank et al.
7631119 December 2009 Moore et al.
7643894 January 2010 Braithwaite et al.
7653344 January 2010 Feldman et al.
7657224 February 2010 Goldberg et al.
7657644 February 2010 Zheng
7657910 February 2010 McAulay et al.
7665115 February 2010 Gallo et al.
7668990 February 2010 Krzyzanowski et al.
7669113 February 2010 Moore et al.
7669219 February 2010 Scott, III
7672470 March 2010 Lee
7675943 March 2010 Mosig et al.
7676044 March 2010 Sasaki et al.
7676142 March 2010 Hung
7688306 March 2010 Wehrenberg et al.
7689304 March 2010 Sasaki
7689305 March 2010 Kreifeldt et al.
7702279 April 2010 Ko et al.
7702403 April 2010 Gladwin et al.
7710941 May 2010 Rietschel et al.
7711774 May 2010 Rothschild
7720096 May 2010 Klemets
7721032 May 2010 Bushell et al.
7742740 June 2010 Goldberg et al.
7743009 June 2010 Hangartner et al.
7746906 June 2010 Jinzaki et al.
7756743 July 2010 Lapcevic
7761176 July 2010 Ben-Yaacov et al.
7765315 July 2010 Batson et al.
RE41608 August 2010 Blair et al.
7793206 September 2010 Lim et al.
7827259 November 2010 Heller et al.
7831054 November 2010 Ball et al.
7835689 November 2010 Goldberg et al.
7853341 December 2010 McCarty et al.
7865137 January 2011 Goldberg et al.
7882234 February 2011 Watanabe et al.
7885622 February 2011 Krampf et al.
7907819 March 2011 Ando et al.
7916877 March 2011 Goldberg et al.
7917082 March 2011 Goldberg et al.
7933418 April 2011 Morishima
7934239 April 2011 Dagman
7945143 May 2011 Yahata et al.
7945636 May 2011 Nelson et al.
7945708 May 2011 Ohkita
7958441 June 2011 Heller et al.
7966388 June 2011 Pugaczewski et al.
7987294 July 2011 Bryce et al.
7995732 August 2011 Koch et al.
7996566 August 2011 Sylvain et al.
7996588 August 2011 Subbiah et al.
8014423 September 2011 Thaler et al.
8015306 September 2011 Bowman
8020023 September 2011 Millington et al.
8023663 September 2011 Goldberg
8028038 September 2011 Weel
8028323 September 2011 Weel
8041062 October 2011 Cohen et al.
8045721 October 2011 Burgan et al.
8045952 October 2011 Qureshey et al.
8050203 November 2011 Jacobsen et al.
8050652 November 2011 Qureshey et al.
8055364 November 2011 Champion
8074253 December 2011 Nathan
8086752 December 2011 Millington et al.
8090317 January 2012 Burge et al.
8103009 January 2012 McCarty et al.
8111132 February 2012 Allen et al.
8112032 February 2012 Ko et al.
8116476 February 2012 Inohara
8126172 February 2012 Horbach et al.
8131389 March 2012 Hardwick et al.
8131390 March 2012 Braithwaite et al.
8144883 March 2012 Pedersen et al.
8148622 April 2012 Rothkopf et al.
8150079 April 2012 Maeda et al.
8169938 May 2012 Duchscher et al.
8170222 May 2012 Dunko
8170260 May 2012 Reining et al.
8175297 May 2012 Ho et al.
8185674 May 2012 Moore et al.
8194874 June 2012 Starobin et al.
8204890 June 2012 Gogan
8208653 June 2012 Eo et al.
8214447 July 2012 Deslippe et al.
8214740 July 2012 Johnson
8214873 July 2012 Weel
8218790 July 2012 Bull et al.
8230099 July 2012 Weel
8233029 July 2012 Yoshida et al.
8233648 July 2012 Sorek et al.
8234305 July 2012 Seligmann et al.
8234395 July 2012 Millington et al.
8239748 August 2012 Moore et al.
8275910 September 2012 Hauck
8279709 October 2012 Choisel et al.
8281001 October 2012 Busam et al.
8285404 October 2012 Kekki
8290603 October 2012 Lambourne
8300845 October 2012 Zurek et al.
8311226 November 2012 Lorgeoux et al.
8315555 November 2012 Ko et al.
8316147 November 2012 Batson et al.
8325931 December 2012 Howard et al.
8326951 December 2012 Millington et al.
8340330 December 2012 Yoon et al.
8345709 January 2013 Nitzpon et al.
8364295 January 2013 Beckmann et al.
8370678 February 2013 Millington et al.
8374595 February 2013 Chien et al.
8407623 March 2013 Kerr et al.
8411883 April 2013 Matsumoto
8423659 April 2013 Millington
8423893 April 2013 Ramsay et al.
8432851 April 2013 Xu et al.
8433076 April 2013 Zurek et al.
8442239 May 2013 Bruelle-Drews et al.
8457334 June 2013 Yoon et al.
8463184 June 2013 Dua
8463875 June 2013 Katz et al.
8473844 June 2013 Kreifeldt et al.
8477958 July 2013 Moeller et al.
8483853 July 2013 Lambourne
8509211 August 2013 Trotter et al.
8520870 August 2013 Sato et al.
8565455 October 2013 Worrell et al.
8577048 November 2013 Chaikin et al.
8588949 November 2013 Lambourne et al.
8600084 December 2013 Garrett
8611559 December 2013 Sanders
8615091 December 2013 Terwal
8639830 January 2014 Bowman
8654995 February 2014 Silber et al.
8672744 March 2014 Gronkowski et al.
8683009 March 2014 Ng et al.
8689036 April 2014 Millington et al.
8731206 May 2014 Park
8750282 June 2014 Gelter et al.
8751026 June 2014 Sato et al.
8762565 June 2014 Togashi et al.
8775546 July 2014 Millington
8818538 August 2014 Sakata
8819554 August 2014 Basso et al.
8831761 September 2014 Kemp et al.
8843586 September 2014 Pantos et al.
8861739 October 2014 Ojanpera
8868698 October 2014 Millington et al.
8885851 November 2014 Westenbroek
8904066 December 2014 Moore et al.
8917877 December 2014 Haaff et al.
8930006 January 2015 Haatainen
8934647 January 2015 Joyce et al.
8934655 January 2015 Breen et al.
8938637 January 2015 Millington et al.
8942252 January 2015 Balassanian et al.
8942395 January 2015 Lissaman et al.
8954177 February 2015 Sanders
8965544 February 2015 Ramsay
8966394 February 2015 Gates et al.
9042556 May 2015 Kallai et al.
9130770 September 2015 Millington et al.
9137602 September 2015 Mayman et al.
9160965 October 2015 Redmann et al.
9195258 November 2015 Millington
9456243 September 2016 Hughes et al.
9507780 November 2016 Rothkopf et al.
2001/0001160 May 2001 Shoff et al.
2001/0009604 July 2001 Ando et al.
2001/0022823 September 2001 Renaud
2001/0027498 October 2001 Van De Meulenhof et al.
2001/0032188 October 2001 Miyabe et al.
2001/0042107 November 2001 Palm
2001/0043456 November 2001 Atkinson
2001/0046235 November 2001 Trevitt et al.
2001/0047377 November 2001 Sincaglia et al.
2001/0050991 December 2001 Eves
2002/0002039 January 2002 Qureshey et al.
2002/0002562 January 2002 Moran et al.
2002/0002565 January 2002 Ohyama
2002/0003548 January 2002 Krusche et al.
2002/0015003 February 2002 Kato et al.
2002/0022453 February 2002 Balog et al.
2002/0026442 February 2002 Lipscomb et al.
2002/0034374 March 2002 Barton
2002/0035621 March 2002 Zintel et al.
2002/0042844 April 2002 Chiazzese
2002/0049843 April 2002 Barone et al.
2002/0062406 May 2002 Chang et al.
2002/0065926 May 2002 Hackney et al.
2002/0067909 June 2002 Iivonen
2002/0072816 June 2002 Shdema et al.
2002/0072817 June 2002 Champion
2002/0073228 June 2002 Cognet et al.
2002/0078293 June 2002 Kou et al.
2002/0080783 June 2002 Fujimori
2002/0090914 July 2002 Kang et al.
2002/0093478 July 2002 Yeh
2002/0095460 July 2002 Benson
2002/0098878 July 2002 Mooney et al.
2002/0101357 August 2002 Gharapetian
2002/0103635 August 2002 Mesarovic et al.
2002/0109710 August 2002 Holtz et al.
2002/0112244 August 2002 Liou et al.
2002/0114354 August 2002 Sinha et al.
2002/0114359 August 2002 Ibaraki et al.
2002/0124097 September 2002 Isely et al.
2002/0124182 September 2002 Bacso et al.
2002/0129156 September 2002 Yoshikawa
2002/0131398 September 2002 Taylor
2002/0131761 September 2002 Kawasaki et al.
2002/0136335 September 2002 Liou et al.
2002/0137505 September 2002 Eiche et al.
2002/0143998 October 2002 Rajagopal et al.
2002/0150053 October 2002 Gray et al.
2002/0159596 October 2002 Durand et al.
2002/0163361 November 2002 Parkin
2002/0165721 November 2002 Chang et al.
2002/0165921 November 2002 Sapieyevski
2002/0168938 November 2002 Chang
2002/0173273 November 2002 Spurgat et al.
2002/0177411 November 2002 Yajima et al.
2002/0181355 December 2002 Shikunami et al.
2002/0184310 December 2002 Traversat et al.
2002/0188762 December 2002 Tomassetti et al.
2002/0194260 December 2002 Headley et al.
2002/0194309 December 2002 Carter et al.
2003/0002609 January 2003 Faller et al.
2003/0008616 January 2003 Anderson
2003/0014486 January 2003 May
2003/0018797 January 2003 Dunning et al.
2003/0020763 January 2003 Mayer et al.
2003/0023741 January 2003 Tomassetti et al.
2003/0035072 February 2003 Hagg
2003/0035444 February 2003 Zwack
2003/0041173 February 2003 Hoyle
2003/0041174 February 2003 Wen et al.
2003/0043856 March 2003 Lakaniemi et al.
2003/0043924 March 2003 Haddad et al.
2003/0050058 March 2003 Walsh et al.
2003/0055892 March 2003 Huitema et al.
2003/0061428 March 2003 Garney et al.
2003/0063528 April 2003 Ogikubo
2003/0063755 April 2003 Nourse et al.
2003/0066094 April 2003 Van Der Schaar et al.
2003/0067437 April 2003 McClintock et al.
2003/0073432 April 2003 Meade
2003/0097478 May 2003 King
2003/0099212 May 2003 Anjum et al.
2003/0099221 May 2003 Rhee
2003/0101253 May 2003 Saito et al.
2003/0103088 June 2003 Dresti et al.
2003/0109270 June 2003 Shorty
2003/0110329 June 2003 Higaki et al.
2003/0118158 June 2003 Hattori
2003/0123853 July 2003 Iwahara et al.
2003/0126211 July 2003 Anttila et al.
2003/0135822 July 2003 Evans
2003/0157951 August 2003 Hasty
2003/0167335 September 2003 Alexander
2003/0172123 September 2003 Polan et al.
2003/0179780 September 2003 Walker et al.
2003/0182254 September 2003 Plastina et al.
2003/0185400 October 2003 Yoshizawa et al.
2003/0187657 October 2003 Erhart et al.
2003/0195964 October 2003 Mane
2003/0198254 October 2003 Sullivan et al.
2003/0198255 October 2003 Sullivan et al.
2003/0198257 October 2003 Sullivan et al.
2003/0200001 October 2003 Goddard
2003/0204273 October 2003 Dinker et al.
2003/0204509 October 2003 Dinker et al.
2003/0210347 November 2003 Kondo
2003/0210796 November 2003 McCarty et al.
2003/0212802 November 2003 Rector et al.
2003/0219007 November 2003 Barrack et al.
2003/0227478 December 2003 Chatfield
2003/0229900 December 2003 Reisman
2003/0231208 December 2003 Hanon et al.
2003/0231871 December 2003 Ushimaru
2003/0235304 December 2003 Evans et al.
2004/0001106 January 2004 Deutscher et al.
2004/0001484 January 2004 Ozguner
2004/0001591 January 2004 Mani et al.
2004/0002938 January 2004 Deguchi
2004/0008852 January 2004 Also et al.
2004/0010727 January 2004 Fujinami
2004/0012620 January 2004 Buhler et al.
2004/0014426 January 2004 Moore
2004/0015252 January 2004 Aiso et al.
2004/0019497 January 2004 Volk et al.
2004/0019807 January 2004 Freund et al.
2004/0019911 January 2004 Gates et al.
2004/0023697 February 2004 Komura
2004/0024478 February 2004 Hans et al.
2004/0024925 February 2004 Cypher et al.
2004/0027166 February 2004 Mangum et al.
2004/0032348 February 2004 Lai et al.
2004/0032421 February 2004 Williamson et al.
2004/0032922 February 2004 Knapp et al.
2004/0037433 February 2004 Chen
2004/0041836 March 2004 Zaner et al.
2004/0042629 March 2004 Mellone et al.
2004/0044742 March 2004 Evron et al.
2004/0048569 March 2004 Kawamura
2004/0059842 March 2004 Hanson et al.
2004/0059965 March 2004 Marshall et al.
2004/0066736 April 2004 Kroeger
2004/0075767 April 2004 Neuman et al.
2004/0078383 April 2004 Mercer et al.
2004/0080671 April 2004 Siemens et al.
2004/0093096 May 2004 Huang et al.
2004/0098754 May 2004 Vella et al.
2004/0111473 June 2004 Lysenko et al.
2004/0117462 June 2004 Bodin et al.
2004/0117491 June 2004 Karaoguz et al.
2004/0117840 June 2004 Boudreau et al.
2004/0117858 June 2004 Boudreau et al.
2004/0128701 July 2004 Kaneko et al.
2004/0131192 July 2004 Metcalf
2004/0133689 July 2004 Vasisht
2004/0143368 July 2004 May et al.
2004/0143675 July 2004 Aust
2004/0143852 July 2004 Meyers
2004/0148237 July 2004 Bittmann et al.
2004/0168081 August 2004 Ladas et al.
2004/0170383 September 2004 Mazur
2004/0171346 September 2004 Lin
2004/0177167 September 2004 Iwamura et al.
2004/0179554 September 2004 Tsao
2004/0183827 September 2004 Putterman et al.
2004/0185773 September 2004 Gerber et al.
2004/0189363 September 2004 Takano
2004/0203378 October 2004 Powers
2004/0203590 October 2004 Shteyn
2004/0208158 October 2004 Fellman et al.
2004/0213230 October 2004 Douskalis et al.
2004/0223622 November 2004 Lindemann et al.
2004/0224638 November 2004 Fadell et al.
2004/0228367 November 2004 Mosig et al.
2004/0248601 December 2004 Chang
2004/0249490 December 2004 Sakai
2004/0249965 December 2004 Huggins et al.
2004/0249982 December 2004 Arnold et al.
2004/0252400 December 2004 Blank et al.
2004/0253969 December 2004 Nguyen et al.
2005/0010691 January 2005 Oyadomari et al.
2005/0011388 January 2005 Kouznetsov
2005/0013394 January 2005 Rausch et al.
2005/0015551 January 2005 Eames et al.
2005/0021590 January 2005 Debique et al.
2005/0027821 February 2005 Alexander et al.
2005/0047605 March 2005 Lee et al.
2005/0058149 March 2005 Howe
2005/0060435 March 2005 Xue et al.
2005/0062637 March 2005 El Zabadani et al.
2005/0081213 April 2005 Suzuoki et al.
2005/0102699 May 2005 Kim et al.
2005/0105052 May 2005 McCormick et al.
2005/0114538 May 2005 Rose
2005/0120128 June 2005 Willes et al.
2005/0125222 June 2005 Brown et al.
2005/0125357 June 2005 Saadat et al.
2005/0131558 June 2005 Braithwaite et al.
2005/0154766 July 2005 Huang et al.
2005/0159833 July 2005 Giaimo et al.
2005/0160270 July 2005 Goldberg et al.
2005/0166135 July 2005 Burke et al.
2005/0168630 August 2005 Yamada et al.
2005/0170781 August 2005 Jacobsen et al.
2005/0177643 August 2005 Xu
2005/0181348 August 2005 Carey et al.
2005/0195205 September 2005 Abrams
2005/0195823 September 2005 Chen et al.
2005/0197725 September 2005 Alexander et al.
2005/0198574 September 2005 Lamkin et al.
2005/0201549 September 2005 Dedieu et al.
2005/0215265 September 2005 Sharma
2005/0216556 September 2005 Manion et al.
2005/0239445 October 2005 Karaoguz et al.
2005/0246421 November 2005 Moore et al.
2005/0262217 November 2005 Nonaka et al.
2005/0281255 December 2005 Davies et al.
2005/0283820 December 2005 Richards et al.
2005/0288805 December 2005 Moore et al.
2005/0289224 December 2005 Deslippe et al.
2006/0041639 February 2006 Lamkin et al.
2006/0049966 March 2006 Ozawa et al.
2006/0072489 April 2006 Toyoshima
2006/0095516 May 2006 Wijeratne
2006/0098936 May 2006 Ikeda et al.
2006/0119497 June 2006 Miller et al.
2006/0142034 June 2006 Wentink et al.
2006/0143236 June 2006 Wu
2006/0155721 July 2006 Grunwald et al.
2006/0161742 July 2006 Sugimoto et al.
2006/0173844 August 2006 Zhang et al.
2006/0173976 August 2006 Vincent et al.
2006/0193454 August 2006 Abou-Chakra et al.
2006/0222186 October 2006 Paige et al.
2006/0227985 October 2006 Kawanami
2006/0259649 November 2006 Hsieh et al.
2006/0270395 November 2006 Dhawan et al.
2007/0003067 January 2007 Gierl et al.
2007/0022207 January 2007 Millington et al.
2007/0038999 February 2007 Millington et al.
2007/0043847 February 2007 Carter et al.
2007/0047712 March 2007 Gross et al.
2007/0048713 March 2007 Plastina et al.
2007/0054680 March 2007 Mo et al.
2007/0087686 April 2007 Holm et al.
2007/0142022 June 2007 Madonna et al.
2007/0142944 June 2007 Goldberg et al.
2007/0143493 June 2007 Mullig et al.
2007/0169115 July 2007 Ko et al.
2007/0180137 August 2007 Rajapakse
2007/0192156 August 2007 Gauger
2007/0249295 October 2007 Ukita et al.
2007/0265031 November 2007 Koizumi et al.
2007/0271388 November 2007 Bowra et al.
2007/0299778 December 2007 Haveson et al.
2008/0002836 January 2008 Moeller et al.
2008/0007649 January 2008 Bennett
2008/0007650 January 2008 Bennett
2008/0007651 January 2008 Bennett
2008/0018785 January 2008 Bennett
2008/0022320 January 2008 Ver Steeg
2008/0025535 January 2008 Rajapakse
2008/0060084 March 2008 Gappa et al.
2008/0072816 March 2008 Riess et al.
2008/0075295 March 2008 Mayman et al.
2008/0077619 March 2008 Gilley et al.
2008/0077620 March 2008 Gilley et al.
2008/0086318 April 2008 Gilley et al.
2008/0091771 April 2008 Allen et al.
2008/0120429 May 2008 Millington et al.
2008/0126943 May 2008 Parasnis et al.
2008/0144861 June 2008 Melanson et al.
2008/0144864 June 2008 Huon
2008/0146289 June 2008 Korneluk et al.
2008/0189272 August 2008 Powers et al.
2008/0205070 August 2008 Osada
2008/0212786 September 2008 Park
2008/0215169 September 2008 Debettencourt et al.
2008/0263010 October 2008 Roychoudhuri et al.
2008/0303947 December 2008 Ohnishi et al.
2009/0011798 January 2009 Yamada
2009/0017868 January 2009 Ueda et al.
2009/0031336 January 2009 Chavez et al.
2009/0062947 March 2009 Lydon et al.
2009/0070434 March 2009 Himmelstein
2009/0077610 March 2009 White et al.
2009/0089327 April 2009 Kalaboukis et al.
2009/0100189 April 2009 Bahren et al.
2009/0124289 May 2009 Nishida
2009/0157905 June 2009 Davis
2009/0164655 June 2009 Pettersson et al.
2009/0193345 July 2009 Wensley et al.
2009/0222115 September 2009 Malcolm et al.
2009/0222392 September 2009 Martin et al.
2009/0228919 September 2009 Zott et al.
2009/0251604 October 2009 Iyer
2010/0004983 January 2010 Dickerson et al.
2010/0031366 February 2010 Knight et al.
2010/0049835 February 2010 Ko et al.
2010/0087089 April 2010 Struthers et al.
2010/0228740 September 2010 Cannistraro et al.
2010/0284389 November 2010 Ramsay et al.
2010/0299639 November 2010 Ramsay et al.
2011/0001632 January 2011 Hohorst
2011/0002487 January 2011 Panther et al.
2011/0066943 March 2011 Brillon et al.
2011/0228944 September 2011 Croghan et al.
2011/0316768 December 2011 McRae
2012/0029671 February 2012 Millington et al.
2012/0030366 February 2012 Collart et al.
2012/0051567 March 2012 Castor-Perry
2012/0060046 March 2012 Millington
2012/0129446 May 2012 Ko et al.
2012/0148075 June 2012 Goh et al.
2012/0185771 July 2012 Rothkopf et al.
2012/0192071 July 2012 Millington
2012/0207290 August 2012 Moyers et al.
2012/0237054 September 2012 Eo et al.
2012/0281058 November 2012 Laney et al.
2012/0290621 November 2012 Heitz, III et al.
2013/0013757 January 2013 Millington et al.
2013/0018960 January 2013 Knysz et al.
2013/0031475 January 2013 Maor et al.
2013/0038726 February 2013 Kim
2013/0041954 February 2013 Kim et al.
2013/0047084 February 2013 Sanders et al.
2013/0052940 February 2013 Brillhart et al.
2013/0070093 March 2013 Rivera et al.
2013/0080599 March 2013 Ko et al.
2013/0094670 April 2013 Millington
2013/0124664 May 2013 Fonseca, Jr. et al.
2013/0129122 May 2013 Johnson et al.
2013/0132837 May 2013 Mead et al.
2013/0159126 June 2013 Elkady
2013/0167029 June 2013 Friesen et al.
2013/0174100 July 2013 Seymour et al.
2013/0174223 July 2013 Dykeman et al.
2013/0179163 July 2013 Herbig et al.
2013/0191454 July 2013 Oliver et al.
2013/0197682 August 2013 Millington
2013/0208911 August 2013 Millington
2013/0208921 August 2013 Millington
2013/0226323 August 2013 Millington
2013/0230175 September 2013 Bech et al.
2013/0232416 September 2013 Millington
2013/0253934 September 2013 Parekh et al.
2013/0279706 October 2013 Marti
2013/0287186 October 2013 Quady
2013/0290504 October 2013 Quady
2014/0006483 January 2014 Garmark et al.
2014/0037097 February 2014 Labosco
2014/0064501 March 2014 Olsen et al.
2014/0075308 March 2014 Sanders et al.
2014/0075311 March 2014 Boettcher et al.
2014/0079242 March 2014 Nguyen et al.
2014/0108929 April 2014 Garmark et al.
2014/0123005 May 2014 Forstall et al.
2014/0140530 May 2014 Gomes-Casseres et al.
2014/0161265 June 2014 Chaikin et al.
2014/0181569 June 2014 Millington et al.
2014/0233755 August 2014 Kim et al.
2014/0242913 August 2014 Pang
2014/0256260 September 2014 Ueda et al.
2014/0267148 September 2014 Luna et al.
2014/0270202 September 2014 Ivanov et al.
2014/0273859 September 2014 Luna et al.
2014/0279889 September 2014 Luna
2014/0285313 September 2014 Luna et al.
2014/0286496 September 2014 Luna et al.
2014/0298174 October 2014 Ikonomov
2014/0323036 October 2014 Daley et al.
2014/0344689 November 2014 Scott et al.
2014/0378056 December 2014 Liu
2015/0019670 January 2015 Redmann
2015/0026613 January 2015 Kwon et al.
2015/0032844 January 2015 Tarr et al.
2015/0043736 February 2015 Olsen et al.
2015/0049248 February 2015 Wang et al.
2015/0074527 March 2015 Sevigny et al.
2015/0074528 March 2015 Sakalowsky et al.
2015/0098576 April 2015 Sundaresan et al.
2015/0139210 May 2015 Marin et al.
2015/0256954 September 2015 Carlsson et al.
2015/0304288 October 2015 Balasaygun et al.
2015/0365987 December 2015 Weel
Foreign Patent Documents
2320451 Mar 2001 CA
1598767 Mar 2005 CN
101292500 Oct 2008 CN
0251584 Jan 1988 EP
0672985 Sep 1995 EP
0772374 May 1997 EP
1111527 Jun 2001 EP
1122931 Aug 2001 EP
1312188 May 2003 EP
1389853 Feb 2004 EP
2713281 Apr 2014 EP
1517464 Mar 2005 EP
0895427 Jan 2006 EP
1416687 Aug 2006 EP
1410686 Mar 2008 EP
2043381 Apr 2009 EP
2161950 Mar 2010 EP
0742674 Apr 2014 EP
2591617 Jun 2014 EP
2284327 May 1995 GB
2338374 Dec 1999 GB
2379533 Mar 2003 GB
2486183 Jun 2012 GB
63269633 Nov 1988 JP
07-210129 Aug 1995 JP
2000149391 May 2000 JP
2001034951 Feb 2001 JP
2002111817 Apr 2002 JP
2002123267 Apr 2002 JP
2002141915 May 2002 JP
2002358241 Dec 2002 JP
2003037585 Feb 2003 JP
2003506765 Feb 2003 JP
2003101958 Apr 2003 JP
2003169089 Jun 2003 JP
2005108427 Apr 2005 JP
2005136457 May 2005 JP
2007241652 Sep 2007 JP
2009506603 Feb 2009 JP
2009075540 Apr 2009 JP
2009135750 Jun 2009 JP
2009535708 Oct 2009 JP
2009538006 Oct 2009 JP
2011130496 Jun 2011 JP
439027 Jun 2001 TW
199525313 Sep 1995 WO
1999023560 May 1999 WO
199961985 Dec 1999 WO
0019693 Apr 2000 WO
0110125 Feb 2001 WO
200153994 Jul 2001 WO
02073851 Sep 2002 WO
03093950 Nov 2003 WO
2003093950 Nov 2003 WO
2005013047 Feb 2005 WO
2007023120 Mar 2007 WO
2007127485 Nov 2007 WO
2007131555 Nov 2007 WO
2007135581 Nov 2007 WO
2008082350 Jul 2008 WO
2008114389 Sep 2008 WO
2012050927 Apr 2012 WO
2014004182 Jan 2014 WO
2014149533 Sep 2014 WO

Other References

Allen & Heath ML4000 User Guide (Year: 2003). cited by examiner .
UPnP Design by Example: A Software Developer's Guide to Universal Plug and Play, Michael Jeronimo and Jack Weast, Intel Press (D+M_0401307-818) (Apr. 2003) (511 pages). cited by applicant .
WANCommonInterfaceConfig:1 Service Template Version 1.01 for UPnP, Ver. 1.0 (Nov. 12, 2001) (D+M_0401820-43) (24 pages). cited by applicant .
WANIPConnection:1 Service Template Version 1.01 for UPnP Ver. 1.0 (Nov. 12, 2001) (D+M_0401844-917) (74 pages). cited by applicant .
WANPPPConnection:1 Service Template Version 1.01 for UPnP, Version 1.0 (Nov. 12, 2001) (D+M_0401918-2006) (89 pages). cited by applicant .
Windows Media Connect Device Compatibility Specification (Apr. 12, 2004) (16 pages). cited by applicant .
"Symantec pcAnywhere User's Guide," v 10.5.1, 1995-2002, 154 pages. cited by applicant .
Rothermel et al., "Clock Hierarchies--An Abstraction for Grouping and Controlling Media Streams," University of Stuttgart Institute of Parallel and Distributed High-Performance Systems, Jan. 1996, 23 pages. cited by applicant .
Rothermel, Kurt, "State-of-the-Art and Future Research in Stream Synchronization," University of Stuttgart, 3 pages. cited by applicant .
Simple Network Time Protocol (SNTPI), RFC 1361 (Aug. 1992) (D+M_0397537-46) (10 pages). cited by applicant .
Simple Network Time Protocol (SNTPII), RFC 1769 (Mar. 1995) (D+M_0397663-76) (14 pages). cited by applicant .
Simple Service Discovery Protocol/1.0 Operating without an Arbiter (Oct. 28, 1999) (24 pages). cited by applicant .
Sonos, Inc. v D&M Holdings, D&M Supp Opposition Brief including Exhibits, Mar. 17, 2017, 23 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings, Expert Report of Jay P. Kesan including Appendices A-P, Feb. 20, 2017, 776 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Exhibit A: Defendants' First Amended Answer to Plaintiffs' Third Amended Complaint, provided Aug. 1, 2016, 26 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Exhibit A: Defendants' Second Amended Answer to Plaintiffs' Third Amended Complaint, provided Sep. 9, 2016, 88 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Sonos's Motion to Strike Defendants' New Amended Answer Submitted with their Reply Brief, provided Sep. 15, 2016, 10 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Sonos's Opposition to Defendants' Motion for Leave to Amend their Answer to Add the Defense of Inequitable Conduct, provided Oct. 31, 2016, 26 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. (No. 14-1330-RGA), Defendants' Final Invalidity Contentions (Jan. 18, 2017) (106 pages) cited by applicant .
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 226, Opinion Denying Inequitable Conduct Defenses, Feb. 6, 2017, updated, 5 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 242, US District Judge Andrews 101 Opinion, Mar. 13, 2017, 16 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings, Sonos Supp Opening Markman Brief including Exhibits, Mar. 3, 2017, 17 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings, Sonos Supp Reply Markman Brief including Exhibits, Mar. 29, 2017, 36 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 1: Defendants' Invalidity Contentions for U.S. Pat. No. 7,571,014 filed Sep. 16, 2016, 270 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 10: Defendants' Invalidity Contentions for U.S. Pat. No. 9,219,959 filed Sep. 27, 2016, 236 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 11: Defendants' Invalidity Contentions for U.S. Pat. No. D559,197 filed Sep. 27, 2016, 52 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 2: Defendants' Invalidity Contentions for U.S. Pat. No. 8,588,949 filed Sep. 27, 2016, 224 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 3: Defendants' Invalidity Contentions for U.S. Pat. No. 8,843,224 filed Sep. 27, 2016, 147 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 4: Defendants' Invalidity Contentions for U.S. Pat. No. 8,938,312 filed Sep. 27, 2016, 229 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 5: Defendants' Invalidity Contentions for U.S. Pat. No. 8,938,637 filed Sep. 27, 2016, 213 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 6: Defendants' Invalidity Contentions for U.S. Pat. No. 9,042,556 filed Sep. 27, 2016, 162 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 7: Defendants' Invalidity Contentions for U.S. Pat. No. 9,195,258 filed Sep. 27, 2016, 418 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 8: Defendants' Invalidity Contentions for U.S. Pat. No. 9,202,509 filed Sep. 27, 2016, 331 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 9: Defendants' Invalidity Contentions for U.S. Pat. No. 9,213,357 filed Sep. 27, 2016, 251 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' Brief in Support of their Motion for Leave to Amend their Answer to Add the Defense of Inequitable Conduct, provided Oct. 12, 2016, 24 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' Opposition to Sonos's Motion to Strike Defendants' New Amended Answer Submitted with their Reply, provided Oct. 3, 2016, 15 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Exhibit A: Defendants' Second Amended Answer to Plaintiffs' Third Amended Complaint, provided Oct. 12, 2016, 43 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Exhibit B: Defendants' Second Amended Answer to Plaintiffs' Third Amended Complaint, provided Oct. 12, 2016, 43 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Opening Brief in Support of Defendants' Motion for Leave to Amend Their Answer to Add the Defense of Inequitable Conduct, provided Aug. 1, 2016, 11 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Order, provided Oct. 7, 2016, 2 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Plaintiff's Opposition to Defendants' Motion for Leave to Amend Their Answer to Add the Defense of Inequitable Conduct, provided Aug. 26, 2016, 25 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Redlined Exhibit B: Defendants' First Amended Answer to Plaintiffs' Third Amended Complaint, provided Aug. 1, 2016, 27 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Reply Brief in Support of Defendants' Motion for Leave to Amend their Answer to Add the Defense of Inequitable Conduct, provided Nov. 10, 2016, 16 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Reply Brief in Support of Defendants' Motion for Leave to Amend their Answer to Add the Defense of Inequitable Conduct, provided Sep. 9, 2016, 16 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 206-1, Transcript of 101 Hearing (Nov. 28, 2016) (28 pages). cited by applicant .
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 207, Public Joint Claim Construction Brief (Nov. 30, 2016) (88 pages) cited by applicant .
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 214, D&M Post-Markman Letter (Dec. 22, 2016) (13 pages). cited by applicant .
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 215, Sonos Post-Markman Letter (Dec. 22, 2016) (15 pages). cited by applicant .
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 219, Claim Construction Opinion (Jan. 12, 2017) (24 pages). cited by applicant .
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 221, Claim Construction Order (Jan. 18, 2017) (2 pages). cited by applicant .
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), Markman Hearing Transcript (Dec. 14, 2016) (69 pages). cited by applicant .
Understanding Universal Plug and Play, Microsoft White Paper (Jun. 2000) (D+M_0402074-118) (45 pages). cited by applicant .
Universal Plug and Play Device Architecture V. 1.0, (Jun. 8, 2000) (54 pages). cited by applicant .
Universal Plug and Play in Windows XP, Tom Fout. Microsoft Corporation (Jul. 2001) (D+M_0402041-73) (33 pages). cited by applicant .
Universal Plug and Play ("UPnP") AV Architecture:1 for UPnP, Version 1.0, (Jun. 25, 2002) (D+M_0298151-72) (22 pages). cited by applicant .
Universal Plug and Play Vendor's Implementation Guide (Jan. 5, 2000) (7 pages). cited by applicant .
UPnP AV Architecture:0.83 (Jun. 12, 2002) (SONDM000115483-504) (22 pages). cited by applicant .
Advisory Action dated Dec. 28, 2016, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 4 pages. cited by applicant .
Chinese Office Action, Office Action dated Dec. 20, 2016, issued in connection with Chinese Application No. 201380044446.8, 16 pages. cited by applicant .
Dhir, Amit, "Wireless Home Networks--DECT, Bluetooth, Home RF, and Wireless LANs," XILINX, wp135 (v1.0), Mar. 21, 2001, 18 pages. cited by applicant .
European Patent Office, Office Action dated Nov. 25, 2016, issued in connection with EP Application No. 13810340.3, 5 pages. cited by applicant .
Final Office Action dated Jan. 12, 2017, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 25 pages. cited by applicant .
Final Office Action dated Dec. 13, 2016, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 41 pages. cited by applicant .
Fout, Tom, "Universal Plug and Play (UPnP) Client Support," Microsoft, Aug. 2001, 18 pages. cited by applicant .
Japanese Patent Office, Final Office Action dated Nov. 8, 2016, issued in connection with Japanese Patent Application No. 2015-520286, 5 pages. cited by applicant .
Japanese Patent Office, Office Action dated Nov. 22, 2016, issued in connection with Japanese Application No. 2015-520288, 6 pages. cited by applicant .
Japanese Patent Office, Office Action dated Nov. 29, 2016, issued in connection with Japanese Application No. 2015-516169, 4 pages. cited by applicant .
Kou et al., "RenderingControl:1 Service Template Version 1.01," Contributing Members of the UPnP Forum, Jun. 25, 2002, 63 pages. cited by applicant .
"Linux SDK for UPnP Devices vl.2," Intel Corporation, Jan. 17, 2003, 102 pages. cited by applicant .
Non-Final Office Action dated Jan. 3, 2017, issued in connection with U.S. Appl. No. 14/808,875, filed Jul. 24, 2015, 10 pages. cited by applicant .
Non-Final Office Action dated Jan. 12, 2017, issued in connection with U.S. Appl. No. 13/895,076, filed May 15, 2013, 10 pages. cited by applicant .
Non-Final Office Action dated Nov. 16, 2016, issued in connection with U.S. Appl. No. 15/228,639, filed Aug. 4, 2016, 15 pages. cited by applicant .
Non-Final Office Action dated Nov. 29, 2016, issued in connection with U.S. Appl. No. 13/894,179, filed May 14, 2013, 14 pages. cited by applicant .
Non-Final Office Action dated Nov. 30, 2016, issued in connection with U.S. Appl. No. 15/243,186, filed Aug. 22, 2016, 12 pages. cited by applicant .
Notice of Allowance dated Dec. 1, 2016, issued in connection with U.S. Appl. No. 15/088,283, filed Apr. 1, 2016, 9 pages. cited by applicant .
Notice of Allowance dated Dec. 2, 2016, issued in connection with U.S. Appl. No. 15/088,532, filed Apr. 1, 2016, 9 pages. cited by applicant .
Notice of Allowance dated Dec. 2, 2016, issued in connection with U.S. Appl. No. 15/088,678, filed Apr. 1, 2016, 9 pages. cited by applicant .
Notice of Allowance dated Dec. 2, 2016, issued in connection with U.S. Appl. No. 15/089,758, filed Apr. 4, 2016, 9 pages. cited by applicant .
Notice of Allowance dated Dec. 2, 2016, issued in connection with U.S. Appl. No. 15/155,149, filed May 16, 2016, 9 pages. cited by applicant .
Notice of Allowance dated Dec. 7, 2016, issued in connection with U.S. Appl. No. 15/156,392, filed May 17, 2016, 9 pages. cited by applicant .
Notice of Allowance dated Dec. 13, 2016, issued in connection with U.S. Appl. No. 15/080,591, filed Mar. 25, 2016, 9 pages. cited by applicant .
Notice of Allowance dated Dec. 14, 2016, issued in connection with U.S. Appl. No. 15/088,906, filed Apr. 1, 2016, 9 pages. cited by applicant .
Notice of Allowance dated Dec. 22, 2016, issued in connection with U.S. Appl. No. 15/080,716, filed Mar. 25, 2016, 9 pages. cited by applicant .
Pascoe, Bob, "Salutation Architectures and the newly defined service discovery protocols from Microsoft® and Sun®," Salutation Consortium, White Paper, Jun. 6, 1999, 5 pages. cited by applicant .
Ritchie et al., "MediaServer:1 Device Template Version 1.01," Contributing Members of the UPnP Forum, Jun. 25, 2002, 12 pages. cited by applicant .
Ritchie et al., "UPnP AV Architecture:1, Version 1.0," Contributing Members of the UPnP Forum, Jun. 25, 2002, 22 pages. cited by applicant .
Ritchie, John, "MediaRenderer:1 Device Template Version 1.01," Contributing Members of the UPnP Forum, Jun. 25, 2002, 12 pages. cited by applicant .
Schulzrinne et al., "RTP: A Transport Protocol for Real-Time Applications," Network Working Group, Jan. 1996, pp. 1-75. cited by applicant .
U.S. Appl. No. 60/490,768, filed Jul. 28, 2003, entitled "Method for synchronizing audio playback between multiple networked devices," 13 pages. cited by applicant .
U.S. Appl. No. 60/825,407, filed Sep. 12, 2006, entitled "Controlling and manipulating groupings in a multi-zone music or media system," 82 pages. cited by applicant .
"UPnP and Sonos Questions," Sonos Community, Dec. 2006, 5 pages. cited by applicant .
Yamaha DME 64 Owner's Manual; copyright 2004, 80 pages. cited by applicant .
Yamaha DME Designer 3.5 setup manual guide; copyright 2004, 16 pages. cited by applicant .
Yamaha DME Designer 3.5 User Manual; Copyright 2004, 507 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Opening Brief in Support of Defendants' Partial Motion for Judgment on the Pleadings for Lack of Patent-Eligible Subject Matter, filed May 6, 2016, 27 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Plaintiff Sonos, Inc.'s Opening Claim Construction Brief, filed Sep. 9, 2016, 26 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Plaintiff Sonos, Inc.'s Response in Opposition to Defendants' Partial Motion for Judgment on the Pleadings, filed May 27, 2016, 24 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Second Amended Complaint for Patent Infringement, filed Feb. 27, 2015, 49 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Third Amended Complaint for Patent Infringement, filed Jan. 29, 2016, 47 pages. cited by applicant .
Sony: AIR-SA 50R Wireless Speaker, Copyright 2009, 2 pages. cited by applicant .
Sony: Altus Quick Setup Guide ALT-SA32PC, Copyright 2009, 2 pages. cited by applicant .
Sony: BD/DVD Home Theatre System Operating Instructions for BDV-E300, E301 and E801, Copyright 2009, 115 pages. cited by applicant .
Sony: BD/DVD Home Theatre System Operating Instructions for BDV-IT1000/BDV-IS1000, Copyright 2008, 159 pages. cited by applicant .
Sony: Blu-ray Disc/DVD Home Theatre System Operating Instructions for BDV-IZ1000W, Copyright 2010, 88 pages. cited by applicant .
Sony: DVD Home Theatre System Operating Instructions for DAV-DZ380W/DZ680W/DZ880W, Copyright 2009, 136 pages. cited by applicant .
Sony: DVD Home Theatre System Operating Instructions for DAV-DZ870W, Copyright 2008, 128 pages. cited by applicant .
Sony Ericsson MS500 User Guide, Copyright 2009, 2 pages. cited by applicant .
Sony: Home Theatre System Operating Instructions for HT-1S100, Copyright 2008, 168 pages. cited by applicant .
Sony: HT-1S100, 5.1 Channel Audio System, last updated Nov. 2009, 2 pages. cited by applicant .
Sony: Multi Channel AV Receiver Operating Instructions, 2007, 80 pages. cited by applicant .
Sony: Multi Channel AV Receiver Operating Instructions for STR-DN1000, Copyright 2009, 136 pages. cited by applicant .
Sony: STR-DN1000, Audio Video Receiver, last updated Aug. 2009, 2 pages. cited by applicant .
Sony: Wireless Surround Kit Operating Instructions for WHAT-SA2, Copyright 2010, 56 pages. cited by applicant .
Taylor, Marilou, "Long Island Sound," Audio Video Interiors, Apr. 2000, 8 pages. cited by applicant .
TOA Corporation, Digital Processor DP-0206 DACsys2000 Version 2.00 Software Instruction Manual, Copyright 2001, 67 pages. cited by applicant .
WaveLan High-Speed Multimode Chip Set, AVAGO0003, Agere Systems, Feb. 2003, 4 pages. cited by applicant .
WaveLan High-Speed Multimode Chip Set, AVAGO0005, Agere Systems, Feb. 2003, 4 pages. cited by applicant .
WaveLAN Wireless Integration Developer Kit (WI-DK) for Access Point Developers, AVAGO0054, Agere Systems, Jul. 2003, 2 pages. cited by applicant .
WaveLAN Wireless Integration-Developer Kit (WI-DK) Hardware Control Function (HCF), AVAGO0052, Agere Systems, Jul. 2003, 2 pages. cited by applicant .
WI-DK Release 2 WaveLan Embedded Drivers for VxWorks and Linux, AVAGO0056, Agere Systems, Jul. 2003, 2 pages. cited by applicant .
WI-DK Release 2 WaveLan END Reference Driver for VxWorks, AVAGO0044, Agere Systems, Jul. 2003, 4 pages. cited by applicant .
WI-DK Release 2 WaveLan LKM Reference Drivers for Linux, AVAGO0048, Agere Systems, Jul. 2003, 4 pages. cited by applicant .
WPA Reauthentication Rates, AVAGO0063, Agere Systems, Feb. 2004, 3 pages. cited by applicant .
"884+ Automatic Matrix Mixer Control System," Ivie Technologies, Inc., 2000, pp. 1-4. cited by applicant .
Barham et al., "Wide Area Audio Synchronisation", University of Cambridge Computer Laboratory, 1995, 5 pages. cited by applicant .
Brassil et al., "Enhancing Internet Streaming Media with Cueing Protocols", 2000, 9 pages. cited by applicant .
Cen et al., "A Distributed Real-Time MPEG Video Audio Player", Department of Computer Science and Engineering. Oregon Graduate Institute of Science and Technology, 1995, 12 pages. cited by applicant .
Chinese Office Action dated Jul. 5, 2016, issued in connection with Chinese Patent Application No. 201380044380.2, 25 pages. cited by applicant .
Dannenberg et al., "A System Supporting Flexible Distributed Real-Time Music Processing", Proceedings of the 2001 International Computer Music Conference, 2001, 4 pages. cited by applicant .
Dannenberg; Roger B., "Remote Access to Interactive Media", Proceedings of the SPIE 1785, 1993, 230-237. cited by applicant .
Day, Rebecca, "Going Elan!" Primedia Inc., 2003, 4 pages. cited by applicant .
Fober et al., "Clock Skew Compensation over a High Latency Network," Proceedings of the ICMC, 2002, pp. 548-552. cited by applicant .
Ishibashi et al., "A Comparison of Media Synchronization Quality Among Reactive Control Schemes," IEEE Infocom, 2001, pp. 77-84. cited by applicant .
"A/V Surround Receiver AVR-5800," Denon Electronics, 2000, 2 pages. cited by applicant .
"A/V System Controleer, Owner's Manual," B&K Compontents, Ltd., 1998, 52 pages. cited by applicant .
"DP-0206 Digital Signal Processor," TOA Electronics, Inc., 2001, pp. 1-12. cited by applicant .
"Home Theater Control Systems," Cinema Source, 2002, 19 pages. cited by applicant .
"Model MRC44 Four Zone--Four Source Audio/Video Controller/Amplifier System," Xantech Corporation, 2002, 52 pages. cited by applicant .
"NexSys Software v. 3 Manual," Crest Audio, Inc., 1997, 76 pages. cited by applicant .
"Residential Distributed Audio Wiring Practices," Leviton Network Solutions, 2001, 13 pages. cited by applicant .
"RVL-6 Modular Multi-Room Controller, Installation & Operation Guide," Nile Audio Corporations, 1999, 46 pages. cited by applicant .
"Systemline Modular Installation Guide, Multiroom System," Systemline, 2003, pp. 1-22. cited by applicant .
"ZR-8630AV MultiZone Audio/Video Receiver, Installation and Operation Guide," Niles Audio Corporation, 2003, 86 pages. cited by applicant .
ZX135: Installation Manual, LA Audio, Apr. 2003, 44 pages. cited by applicant .
Lienhart et al., "On the Importance of Exact Synchronization for Distributed Audio Signal Processing", Session L: Poster Session II--ICASSP'03 Papers, 2002, 1 page. cited by applicant .
Liu et al., "A synchronization control scheme for real-time streaming multimedia applications" Packet Video. 2003, 10 pages, vol. 2003. cited by applicant .
Liu et al., "Adaptive Delay Concealment for Internet Voice Applications with Packet-Based Time-Scale Modification." Information Technologies 2000, pp. 91-102. cited by applicant .
Non-Final Office Action dated Sep. 7, 2016, issued in connection with U.S. Appl. No. 13/864,248, filed Apr. 17, 2013, 12 pages. cited by applicant .
Non-Final Office Action dated Aug. 9, 2016, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 31 pages. cited by applicant .
Notice of Allowance dated Aug. 30, 2016, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 7 pages. cited by applicant .
Pillai et al., "A Method to Improve the Robustness of MPEG Video Applications over Wireless Networks", Kent Ridge Digital Labs, 2000, 15 pages. cited by applicant .
Rangan et al., "Feedback Techniques for Continuity and Synchronization in Multimedia Information Retrieval", ACM Transactions on Information Systems, 1995, 13(2), 145-176. cited by applicant .
Reid, Mark, "Multimedia conferencing over ISDN and IP networks using ITU-T H-series recommendations: architecture, control and coordination," Computer Networks, 1999, vol. 31, pp. 225-235. cited by applicant .
Rothermel et al., "An Adaptive Protocol for Synchronizing Media Streams", Institute of Parallel and Distributed High-Performance Systems (IPVR), 1997, 26 pages. cited by applicant .
Rothermel et al., "An Adaptive Stream Synchronization Protocol," 5th International Workshop on Network and Operating System Support for Digital Audio and Video, Apr. 18-21, 1995, 12 pages. cited by applicant .
Rothermel et al., "Synchronization in Joint-Viewing Environments", University of Stuttgart Institute of Parallel and Distributed High-Performance Systems, 1992, 13 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 1: Defendants' Invalidity Contentions for U.S. Pat. No. 7,571,014 filed Apr. 15, 2016, 161 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 10: Defendants' Invalidity Contentions for U.S. Pat. No. 9,213,357 filed Apr. 15, 2016, 244 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 2: Defendants' Invalidity Contentions for U.S. Pat. No. 8,588,949 filed Apr. 15, 2016, 112 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 5: Defendants' Invalidity Contentions for U.S. Pat. No. 8,938,637 filed Apr. 15, 2016, 177 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 8: Defendants' Invalidity Contentions for U.S. Pat. No. 9,195,258 filed Apr. 15, 2016, 400 pages. cited by applicant .
Maniactools, "Identify Duplicate Files by Sound," Sep. 28, 2010, http://www.maniactools.com/soft/music-duplicate-remover/identify-duplicat- e-files-by-sound.shtml. cited by applicant .
Mills David L., "Network Time Protocol (Version 3) Specification, Implementation and Analysis," Network Working Group, Mar. 1992, 7 pages. cited by applicant .
Mills, David L., "Precision Synchronization of Computer Network Clocks," ACM SIGCOMM Computer Communication Review, 1994, pp. 28-43, vol. 24, No. 2. cited by applicant .
Motorola., "Simplefi, Wireless Digital Audio Receiver, Installation and User Guide", Dec. 31, 2001, 111 pages. cited by applicant .
Nilsson, M., "ID3 Tag Version 2", Mar. 26, 1998, 28 pages. cited by applicant .
Non-Final Office Action dated May 1, 2014, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 31 pages. cited by applicant .
Non-Final Office Action dated Dec. 5, 2013, issued in connection with U.S. Appl. No. 13/827,653, filed Mar. 14, 2013, 28 pages. cited by applicant .
Non-Final Office Action dated Jan. 5, 2012, issued in connection with U.S. Appl. No. 13/298,090, filed Nov. 16, 2011, 35 pages. cited by applicant .
Non-Final Office Action dated May 6, 2014, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 23 pages. cited by applicant .
Non-final Office Action dated Apr. 10, 2013, issued in connection with U.S. Appl. No. 13/619,237, filed Sep. 14, 2012, 10 pages. cited by applicant .
Non-Final Office Action dated May 12, 2014, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 23 pages. cited by applicant .
"Non-Final Office Action dated May 14, 2014, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 14 pages". cited by applicant .
"Non-Final Office Action dated Jun. 17, 2014, issued in connection with U.S. Appl. No. 14/176,808, filed Feb. 10, 2014, 6 pages". cited by applicant .
"Non-Final Office Action dated Dec. 18, 2013, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 12 pages". cited by applicant .
"Non-Final Office Action dated Jan. 18, 2008, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 28 pages". cited by applicant .
"Non-Final Office Action dated Apr. 19, 2010, issued in connection with U.S. Appl. No. 11/801,468, filed May 9, 2007, 16 pages". cited by applicant .
"Non-Final Office Action dated Mar. 19, 2013, issued in connection with U.S. Appl. No. 13/724,048, filed Dec. 21, 2012, 9 pages". cited by applicant .
"Non-Final Office Action dated Jun. 21, 2011, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 13 pages". cited by applicant .
"Non-Final Office Action dated Jan. 22, 2009, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 18 pages". cited by applicant .
"Non-Final Office Action dated Jul. 25, 2014, issued in connection with U.S. Appl. No. 14/184,526, filed Feb. 19, 2014, 9 pages". cited by applicant .
"Non-Final Office Action dated Jul. 25, 2014, issued in connection with U.S. Appl. No. 14/184,935, filed Feb. 20, 2014, 11 pages". cited by applicant .
"Non-Final Office Action dated Jun. 25, 2010, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 17 pages". cited by applicant .
"Non-Final Office Action dated Nov. 25, 2013, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 19 pages". cited by applicant .
"Non-Final Office Action dated May 27, 2014, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 13 pages". cited by applicant .
Non-Final Office Action dated Feb. 29, 2012, issued in connection with U.S. Appl. No. 13/297,000, filed Nov. 15, 2011, 10 pages. cited by applicant .
"Non-Final Office Action dated Nov. 29, 2010, issued in connection with U.S. Appl. No. 11/801,468, filed May 9, 2007, 17 pages". cited by applicant .
Non-Final Office Action dated Jul. 30, 2013 issued in connection with U.S. Appl. No. 13/724,048, filed Dec. 21, 2012, 7 pages. cited by applicant .
Non-Final Office Action dated Jul. 31, 2014, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 31 pages. cited by applicant .
"Non-Final Office Action dated Dec. 1, 2014, issued in connection with U.S. Appl. No. 14/516,867, filed Oct. 17, 2014, 11 pages". cited by applicant .
"Non-Final Office Action dated Jun. 3, 2015, issued in connection with U.S. Appl. No. 14/564,544, filed Dec. 9, 2014, 7 pages.". cited by applicant .
"Non-Final Office Action dated Jun. 4, 2015, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 16 pages.". cited by applicant .
"Non-Final Office Action dated Mar. 4, 2015, issued in connection with U.S. Appl. No. 13/435,776, filed Mar. 30, 2012, 16 pages.". cited by applicant .
Non-Final Office Action dated Mar. 8, 2016, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 13 pages. cited by applicant .
"Non-Final Office Action dated Mar. 10, 2011, issued in connection with U.S. Appl. No. 12/035,112, filed Feb. 21, 2008, 12 pages.". cited by applicant .
"Non-Final Office Action dated Jun. 12, 2015, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 16 pages". cited by applicant .
Non-Final Office Action dated Mar. 12, 2015, issued in connection with U.S. Appl. No. 13/705,174, filed Dec. 5, 2012, 13 pages. cited by applicant .
Non-Final Office Action dated Jan. 13, 2016, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 14 pages. cited by applicant .
"Non-Final Office Action dated Mar. 13, 2015, issued in connection with U.S. Appl. No. 13/705,177, filed Dec. 5, 2012, 15 pages.". cited by applicant .
"Non-Final Office Action dated Nov. 17, 2014, issued in connection with U.S. Appl. No. 13/864,247, filed Apr. 17, 2013, 11 pages". cited by applicant .
"Non-Final Office Action dated Feb. 18, 2009, issued in connection with U.S. Appl. No. 10/861,653, filed Jun. 5, 2004, 18 pages.". cited by applicant .
"Non-Final Office Action dated Nov. 18, 2014, issued in connection with U.S. Appl. No. 13/435,739, filed Mar. 30, 2012, 10 pages". cited by applicant .
"Non-Final Office Action dated Jun. 19, 2015, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 38 pages.". cited by applicant .
"Non-Final Office Action dated Nov. 19, 2014, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 9 pages". cited by applicant .
Non-Final Office Action dated Aug. 20, 2009, issued in connection with U.S. Appl. No. 11/906,702, filed Oct. 2, 2007, 27 pages. cited by applicant .
Non-Final Office Action dated Jun. 23, 2015, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 30 pages. cited by applicant .
"Non-Final Office Action dated Oct. 23, 2014, issued in connection with U.S. Appl. No. 13/848,904, filed Mar. 22, 2013, 11 pages". cited by applicant .
"Non-Final Office Action dated Oct. 23, 2014, issued in connection with U.S. Appl. No. 13/864,251, filed Apr. 17, 2013, 11 pages". cited by applicant .
"Non-Final Office Action dated Oct. 23, 2014, issued in connection with U.S. Appl. No. 13/888,203, filed May 6, 2013, 9 pages". cited by applicant .
"Non-final Office Action dated Oct. 24, 2014, issued in connection with U.S. Appl. No. 13/435,776, filed Mar. 30, 2012, 14 pages". cited by applicant .
"Non-Final Office Action dated Feb. 26, 2015, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 25 pages.". cited by applicant .
Non-Final Office Action dated Mar. 26, 2015, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 18 pages. cited by applicant .
Non-Final Office Action dated Jun. 27, 2008, issued in connection with U.S. Appl. No. 10/861,653, filed Jun. 5, 2004, 19 pages. cited by applicant .
Non-Final Office Action dated Mar. 27, 2015, issued in connection with U.S. Appl. No. 13/705,178, filed Dec. 5, 2012, 14 pages. cited by applicant .
Non-Final Office Action dated Dec. 28, 2015, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 29 pages. cited by applicant .
Non-Final Office Action dated Apr. 30, 2012, issued in connection with U.S. Appl. No. 13/204,511, filed Aug. 5, 2011, 16 pages. cited by applicant .
Non-Final Office Action dated Jan. 30, 2015, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 13 pages. cited by applicant .
Non-Final Office Action dated Jan. 30, 2015, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 30 pages. cited by applicant .
North American MPEG-2 Information, "The MPEG-2 Transport Stream", Retrieved from the Internet, 2006, pp. 1-5. cited by applicant .
Notice of Allowance dated Jan. 31, 2013, issued in connection with U.S. Appl. No. 13/298,090, filed Nov. 16, 2011, 19 pages. cited by applicant .
Notice of Allowance dated Jul. 2, 2015, issued in connection with U.S. Appl. No. 13/848,904, filed Mar. 22, 2013, 17 pages. cited by applicant .
Notice of Allowance dated Jul. 2, 2015, issued in connection with U.S. Appl. No. 13/888,203, filed May 6, 2013, 19 pages. cited by applicant .
Notice of Allowance dated Jul. 2, 2015, issued in connection with U.S. Appl. No. 14/184,935, filed Feb. 20, 2014, 23 pages. cited by applicant .
Notice of Allowance dated Sep. 3, 2015, issued in connection with U.S. Appl. No. 13/705,174, filed Dec. 5, 2012, 4 pages. cited by applicant .
Notice of Allowance dated Aug. 4, 2015, issued in connection with U.S. Appl. No. 14/516,867, filed Oct. 17, 2014, 13 pages. cited by applicant .
Notice of Allowance dated Oct. 5, 2012, issued in connection with U.S. Appl. No. 13/204,511, filed Aug. 5, 2011, 11 pages. cited by applicant .
Notice of Allowance dated Mar. 6, 2014, issued in connection with U.S. Appl. No. 13/827,653, filed Mar. 14, 2013, 17 pages. cited by applicant .
Notice of Allowance dated May 6, 2011, issued in connection with U.S. Appl. No. 11/801,468, filed May 9, 2007, 10 pages. cited by applicant .
Notice of Allowance dated Sep. 6, 2013, issued in connection with U.S. Appl. No. 13/619,237, filed Sep. 14, 2012, 10 pages. cited by applicant .
Notice of Allowance dated Apr. 7, 2016, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 40 pages. cited by applicant .
Notice of Allowance dated Oct. 7, 2015, issued in connection with U.S. Appl. No. 14/184,526, filed Feb. 19, 2014, 7 pages. cited by applicant .
Notice of Allowance dated Oct. 9, 2015, issued in connection with U.S. Appl. No. 13/435,739, filed Mar. 30, 2012, 4 pages. cited by applicant .
Notice of Allowance dated Aug. 10, 2015, issued in connection with U.S. Appl. No. 13/848,904, filed Mar. 22, 2013, 9 pages. cited by applicant .
Notice of Allowance dated Nov. 10, 2011, issued in connection with U.S. Appl. No. 11/906,702, filed Oct. 2, 2007, 17 pages. cited by applicant .
Notice of Allowance dated Apr. 11, 2016, issued in connection with U.S. Appl. No. 13/864,247, filed Apr. 17, 2013, 21 pages. cited by applicant .
Notice of Allowance dated Jan. 11, 2016, issued in connection with U.S. Appl. No. 14/564,544, filed Dec. 9, 2014, 5 pages. cited by applicant .
Notice of Allowance dated Aug. 12, 2015, issued in connection with U.S. Appl. No. 13/435,739, filed Mar. 30, 2012, 27 pages. cited by applicant .
Notice of Allowance dated Jul. 13, 2015, issued in connection with U.S. Appl. No. 14/184,526, filed Feb. 19, 2014, 22 pages. cited by applicant .
Notice of Allowance dated Nov. 13, 2013, issued in connection with U.S. Appl. No. 13/724,048, filed Dec. 21, 2012, 7 pages. cited by applicant .
Notice of Allowance dated Oct. 13, 2015, issued in connection with U.S. Appl. No. 13/864,251, filed Apr. 17, 2013, 7 pages. cited by applicant .
Notice of Allowance dated Jun. 14, 2012, issued in connection with U.S. Appl. No. 12/035,112, filed Feb. 21, 2008, 9 pages. cited by applicant .
Notice of Allowance dated Jul. 15, 2015, issued in connection with U.S. Appl. No. 13/705,174, filed Dec. 5, 2012, 18 pages. cited by applicant .
Notice of Allowance dated Jun. 16, 2009, issued in connection with U.S. Appl. No. 10/861,653, filed Jun. 5, 2004, 11 pages. cited by applicant .
Notice of Allowance dated Jul. 17, 2015, issued in connection with U.S. Appl. No. 13/864,251, filed Apr. 17, 2013, 20 pages. cited by applicant .
Notice of Allowance dated May 19, 2015, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 7 pages. cited by applicant .
Notice of Allowance dated Sep. 21, 2015, issued in connection with U.S. Appl. No. 13/297,000, filed Nov. 15, 2011, 11 pages. cited by applicant .
Notice of Allowance dated Sep. 22, 2015, issued in connection with U.S. Appl. No. 13/888,203, filed May 6, 2013, 7 pages. cited by applicant .
Notice of Allowance dated Sep. 24, 2015, issued in connection with U.S. Appl. No. 13/705,174, filed Dec. 5, 2012, 7 pages. cited by applicant .
Notice of Allowance dated Sep. 24, 2015, issued in connection with U.S. Appl. No. 14/184,935, filed Feb. 20, 2014, 7 pages. cited by applicant .
Notice of Allowance dated Sep. 25, 2014, issued in connection with U.S. Appl. No. 14/176,808, filed Feb. 10, 2014, 5 pages. cited by applicant .
Notice of Allowance dated Aug. 27, 2015, issued in connection with U.S. Appl. No. 13/705,177, filed Dec. 5, 2012, 34 pages. cited by applicant .
Notice of Allowance dated Aug. 27, 2015, issued in connection with U.S. Appl. No. 14/505,027, filed Oct. 2, 2014, 18 pages. cited by applicant .
Notice of Allowance dated Dec. 27, 2011, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 15 pages. cited by applicant .
Notice of Allowance dated Jul. 29, 2015, issued in connection with U.S. Appl. No. 13/359,976, filed Jan. 27, 2012, 28 pages. cited by applicant .
Notice of Allowance dated Jul. 29, 2015, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 9 pages. cited by applicant .
Notice of Allowance dated Jul. 30, 2015, issued in connection with U.S. Appl. No. 13/705,178, filed Dec. 5, 2012, 18 pages. cited by applicant .
Notice of Allowance dated Aug. 5, 2015, issued in connection with U.S. Appl. No. 13/435,776, filed Mar. 30, 2012, 26 pages. cited by applicant .
Notice of Allowance dated Jul. 6, 2015, issued in connection with U.S. Appl. No. 13/297,000, filed Nov. 15, 2011, 24 pages. cited by applicant .
Nutzel et al., "Sharing Systems for Future HiFi Systems", IEEE, 2004, 9 pages. cited by applicant .
Palm, Inc., "Handbook for the Palm VII Handheld," May 2000, 311 pages. cited by applicant .
Park et al., "Group Synchronization in MultiCast Media Communications," Proceedings of the 5th Research on Multicast Technology Workshop, 2003, 5 pages. cited by applicant .
Polycom Conference Composer manual: copyright 2001. cited by applicant .
Pre-Interview First Office Action dated Mar. 10, 2015, issued in connection with U.S. Appl. No. 14/505,027, filed Oct. 2, 2014, 4 pages. cited by applicant .
Presentations at WinHEC 2000, May 2000, 138 pages. cited by applicant .
PRISMIQ, Inc., "PRISMIQ Media Player User Guide", 2003, 44 pages. cited by applicant .
Re-Exam Final Office Action dated Aug. 5, 2015, issued in connection with U.S. Appl. No. 90/013,423, filed Jan. 5, 2015, 25 pages. cited by applicant .
Re-Exam Non-Final Office Action dated Apr. 22, 2015, issued in connection with U.S. Appl. No. 90/013,423, filed Jan. 5, 2015, 16 pages. cited by applicant .
Renkus Heinz Manual; available for sale at least 2004, 6 pages. cited by applicant .
Roland Corporation, "Roland announces BA-55 Portable PA System," press release, Apr. 6, 2011, 2 pages. cited by applicant .
Rothermel et al., "An Adaptive Stream Synchronization Protocol," 5th International Workshop on Network and Operating System Support for Digital Audio and Video, 1995, 13 pages. cited by applicant .
Schmandt et al., "Impromptu: Managing Networked Audio Applications for Mobile Users", 2004, 11 pages. cited by applicant .
Schulzrinne H., et al., "RTP: A Transport Protocol for Real-Time Applications, RFC 3550," Network Working Group, 2003, pp. 1-89. cited by applicant .
UPnP; "Universal Plug and Play Device Architecture," Jun. 8, 2000; version 1.0; Microsoft Corporation; pp. 1-54. cited by applicant .
"Welcome. You're watching Apple TV." Apple TV 1st Generation Setup Guide, Apr. 8, 2008 Retrieved Oct. 14, 2014, 40 pages. cited by applicant .
"Welcome. You're watching Apple TV." Apple TV 2nd Generation Setup Guide, Mar. 10, 2011 Retrieved Oct. 16, 2014, 35 pages. cited by applicant .
"Welcome. You're watching Apple TV." Apple TV 3rd Generation Setup Guide, Mar. 16, 2012 Retrieved Oct. 16, 2014, 35 pages. cited by applicant .
Yamaha DME 32 manual: copyright 2001. cited by applicant .
Yamaha DME Designer software manual: Copyright 2004, 482 pages. cited by applicant .
Advanced Driver Tab User Interface WaveLan GUI Guide, AVAGO0009, Agere Systems, Feb. 2004, 4 pages. cited by applicant .
Agere Systems' Voice-over-Wireless LAN (VoWLAN) Station Quality of Service, AVAGO0015, Agere Systems, Jan. 2005, 5 pages. cited by applicant .
Akyildiz et al., "Multimedia Group Synchronization Protocols for Integrated Services Networks," IEEE Journal on Selected Areas in Communications, 1996, pp. 162-173, vol. 14, No. 1. cited by applicant .
Audio Authority: How to Install and Use the Model 1154 Signal Sensing Auto Selector, 2002, 4 pages. cited by applicant .
Audio Authority: Model 1154B High Definition AV Auto Selector, 2008, 8 pages. cited by applicant .
AudioSource: AMP 100 User Manual, 2003, 4 pages. cited by applicant .
Automatic Profile Hunting Functional Description, AVAGO0013, Agere Systems, Feb. 2004, 2 pages. cited by applicant .
AXIS Communication: AXIS P8221 Network I/O Audio Module, 2009, 41 pages. cited by applicant .
Balfanz et al., "Network-in-a-Box: How to Set Up a Secure Wireless Network in Under a Minute", 13th USENIX Security Symposium--Technical Paper, 2002, 23 pages. cited by applicant .
Balfanz et al., "Talking to Strangers: Authentication in Ad-Hoc Wireless Networks", Xerox Palo Alto Research Center, 2002, 13 pages. cited by applicant .
Bogen Communications, Inc., ProMatrix Digitally Matrixed Amplifier Model PM3180, Copyright 1996, 2 pages. cited by applicant .
Breebaart et al., "Multi-Channel Goes Mobile: MPEG Surround Binaural Rendering", AES 29th International Conference, Sep. 2-4, 2006, pp. 1-13. cited by applicant .
Change Notification: Agere Systems WaveLan Multimode Reference Design (D2 to D3), AVAGO0042, Agere Systems, Nov. 2004, 2 pages. cited by applicant .
Deep-Sleep Implementation in WL60011 for IEEE 802.11b Applications, AVAGO0020, Agere Systems, Jul. 2004, 22 pages. cited by applicant .
Denon AV Surround Receiver AVR-1604/684 User's Manual, 2004, 128 pages. cited by applicant .
Denon AV Surround Receiver AVR-5800 Operating Instructions, Copyright 2000, 67 pages. cited by applicant .
Faller, Christof, "Coding of Spatial Audio Compatible with Different Playback Formats", Audio Engineering Society Convention Paper (Presented at the 117th Convention), 2004, Oct. 28-31, 2004, 12 pages. cited by applicant .
Fireball DVD and Music Manager DVDM-100 Installation and User's Guide, Copyright 2003, 185 pages. cited by applicant .
Fireball MP-200 User's Manual, Copyright 2006, 93 pages. cited by applicant .
Fireball Remote Control Guide WD006-1-1, Copyright 2003, 19 pages. cited by applicant .
Fireball SE-D1 User's Manual, Copyright 2005, 90 pages. cited by applicant .
Gaston et al., "Methods for Sharing Stereo and Multichannel Recordings Among Planetariums", Audio Engineering Society Convention Paper 7474, 2008, 15 pages. cited by applicant .
Herre et al., "The Reference Model Architecture for MPEG Spatial Audio Coding", Audio Engineering Society Convention Paper (Presented at the 118th Convention), May 28-31, 2005, 13 pages. cited by applicant .
IBM Home Director Installation and Service Manual, Copyright 1998, 124 pages. cited by applicant .
IBM Home Director Owner's Manual, Copyright 1999, 67 pages. cited by applicant .
Integra Audio Network Receiver NAC 2.3 Instruction Manual, 68 pages. cited by applicant .
Integra Audio Network Server NAS 2.3 Instruction Manual, pp. 1-32. cited by applicant .
Integra Service Manual, Audio Network Receiver Model NAC-2.3, Dec. 2002, 44 pages. cited by applicant .
Issues with Mixed IEEE 802.11b/802.11g Networks, AVAGO0058, Agere Systems, Feb. 2004, 5 pages. cited by applicant .
Lake Processors: Lake.RTM. LM Series Digital Audio Processors Operation Manual, 2011, 71 pages. cited by applicant .
"Denon 2003-2004 Product Catalog," Denon, 2003-2004, 44 pages. cited by applicant .
LG: RJP-201M Remote Jack Pack Installation and Setup Guide, 2010, 24 pages. cited by applicant .
LinkSys by Cisco, Wireless Home Audio Controller, Wireless-N Touchscreen Remote DMRW1000 Datasheet, Copyright 2008, 2 pages. cited by applicant .
LinkSys by Cisco, Wireless Home Audio Controller, Wireless-N Touchscreen Remote DMRW1000 User Guide, Copyright 2008, 64 pages. cited by applicant .
LinkSys by Cisco, Wireless Home Audio Player, Wireless-N Music Extender DMP100 Quick Installation Guide, Copyright 2009, 32 pages. cited by applicant .
LinkSys by Cisco, Wireless Home Audio Player, Wireless-N Music Extender DMP100 User Guide, Copyright 2008, 65 pages. cited by applicant .
Liu et al., "Adaptive Delay Concealment for Internet Voice Applications with Packet-Based Time-Scale Modification." Information Technologies 2000, 4 pages. cited by applicant .
Parasound Zpre2 Zone Preamplifier with PTZI Remote Control, 2005, 16 pages. cited by applicant .
Proficient Audio Systems M6 Quick Start Guide, 2011, 5 pages. cited by applicant .
Proficient Audio Systems: Proficient Editor Advanced Programming Guide, 2007, 40 pages. cited by applicant .
Programming Interface for WL54040 Dual-Band Wireless Transceiver, AVAGO0066, Agere Systems, May 2004, 16 pages. cited by applicant .
Radio Shack, "Auto-Sensing 4-Way Audio/Video Selector Switch", 2004, 1 page. cited by applicant .
RadioShack, Pro-2053 Scanner, 2002 Catalog, part 1, 100 pages. cited by applicant .
RadioShack, Pro-2053 Scanner, 2002 Catalog, part 2, 100 pages. cited by applicant .
RadioShack, Pro-2053 Scanner, 2002 Catalog, part 3, 100 pages. cited by applicant .
RadioShack, Pro-2053 Scanner, 2002 Catalog, part 4, 100 pages. cited by applicant .
RadioShack, Pro-2053 Scanner, 2002 Catalog, part 5, 46 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Complaint for Patent Infringement, filed Oct. 21, 2014, 20 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Declaration of Steven C. Visser, executed Sep. 9, 2016, 40 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions, filed Sep. 14, 2016, 100 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 11: Defendants' Invalidity Contentions for U.S. Pat. No. 9,219,959 filed Apr. 15, 2016, 172 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 12: Defendants' Invalidity Contentions for U.S. Design Patent No. D559,197 filed Apr. 15, 2016, 36 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 3: Defendants' Invalidity Contentions for U.S. Pat. No. 8,843,224 filed Apr. 15, 2016, 118 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 4: Defendants' Invalidity Contentions for U.S. Pat. No. 8,938,312 filed Apr. 15, 2016, 217 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 6: Defendants' Invalidity Contentions for U.S. Pat. No. 9,042,556 filed Apr. 15, 2016, 86 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 7: Defendants' Invalidity Contentions for U.S. Pat. No. 9,130,771 filed Apr. 15, 2016, 203 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 9: Defendants' Invalidity Contentions for U.S. Pat. No. 9,202,509 filed Apr. 15, 2016, 163 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions, filed Apr. 15, 2016, 97 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Preliminary Identification of Indefinite Terms, provided Jul. 29, 2016, 8 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Preliminary Identification of Prior Art References, provided Jul. 29, 2016, 5 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' Amended Answer, Defenses, and Counterclaims for Patent Infringement, filed Nov. 30, 2015, 47 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' Answer to Plaintiff's Second Amended Complaint, filed Apr. 30, 2015, 19 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' First Amended Answer to Plaintiffs' Third Amended Complaint, filed Sep. 7, 2016, 23 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' Reply in Support of Partial Motion for Judgment on the Pleadings, filed Jun. 10, 2016, 15 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Exhibit A: Defendants' Second Amended Answer to Plaintiffs' Third Amended Complaint, filed Sep. 9, 2016, 43 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., First Amended Complaint for Patent Infringement, filed Dec. 17, 2014, 26 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Joint Claim Construction Chart, vol. 1 of 3 with Exhibits A-O, filed Aug. 17, 2016, 30 pages. cited by applicant .
Advisory Action dated Jun. 9, 2016, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 14 pages. cited by applicant .
Final Office Action dated Jun. 3, 2016, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 24 pages. cited by applicant .
Final Office Action dated May 25, 2016, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 33 pages. cited by applicant .
Japanese Patent Office, Office Action dated May 24, 2016, issued in connection with Japanese Patent Application No. 2014-220704, 7 pages. cited by applicant .
Non-Final Office Action dated Jun. 1, 2016, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 21 pages. cited by applicant .
Non-Final Office Action dated May 10, 2016, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 22 pages. cited by applicant .
Advisory Action dated Feb. 2, 2016, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 8 pages. cited by applicant .
Advisory Action dated Sep. 18, 2008, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 8 pages. cited by applicant .
Advisory Action dated Feb. 1, 2016, issued in connection with U.S. Appl. No. 13/864,247, filed Apr. 17, 2013, 6 pages. cited by applicant .
Advisory Action dated Jun. 1, 2015, issued in connection with U.S. Appl. No. 14/516,867, filed Oct. 17, 2014, 11 pages. cited by applicant .
Advisory Action dated Mar. 2, 2015, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 3 pages. cited by applicant .
Advisory Action dated Jan. 5, 2012, issued in connection with U.S. Appl. No. 12/035,112, filed Feb. 21, 2008, 3 pages. cited by applicant .
Advisory Action dated Sep. 5, 2014, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 3 pages. cited by applicant .
Advisory Action dated Jan. 8, 2015, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 4 pages. cited by applicant .
Advisory Action dated Feb. 10, 2016, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 3 pages. cited by applicant .
Advisory Action dated Nov. 12, 2014, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 6 pages. cited by applicant .
Advisory Action dated Apr. 15, 2015, issued in connection with U.S. Appl. No. 14/184,526, filed Feb. 19, 2014, 9 pages. cited by applicant .
Advisory Action dated Apr. 15, 2015, issued in connection with U.S. Appl. No. 14/184,935, filed Feb. 20, 2014, 9 pages. cited by applicant .
Advisory Action dated Mar. 25, 2015, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 5 pages. cited by applicant .
Advisory Action dated Feb. 26, 2015, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 3 pages. cited by applicant .
Advisory Action dated Nov. 26, 2014, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 9 pages. cited by applicant .
Advisory Action dated Jul. 28, 2015, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 7 pages. cited by applicant .
Advisory Action dated Sep. 28, 2009, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 4 pages. cited by applicant .
"AudioTron Quick Start Guide, Version 1.0", Voyetra Turtle Beach, Inc., Mar. 2001, 24 pages. cited by applicant .
"AudioTron Reference Manual, Version 3.0", Voyetra Turtle Beach, Inc., May 2002, 70 pages. cited by applicant .
"AudioTron Setup Guide, Version 3.0", Voyetra Turtle Beach, Inc., May 2002, 38 pages. cited by applicant .
Baldwin, Roberto, "How-To: Setup iTunes DJ on Your Mac and iPhone", available at http://www.maclife.com/article/howtos/howto_setup_itunes_dj_your_mac_and_iphone, archived on Mar. 17, 2009, 4 pages. cited by applicant .
Baudisch et al., "Flat Volume Control: Improving Usability by Hiding the Volume Control Hierarchy in the User Interface", 2004, 8 pages. cited by applicant .
Benslimane, Abderrahim, "A Multimedia Synchronization Protocol for Multicast Groups," Proceedings of the 26th Euromicro Conference, 2000, pp. 456-463, vol. 1. cited by applicant .
Biersack et al., "Intra- and Inter-Stream Synchronization for Stored Multimedia Streams," IEEE International Conference on Multimedia Computing and Systems, 1996, pp. 372-381. cited by applicant .
Blakowski, G. et al., "A Media Synchronization Survey: Reference Model, Specification, and Case Studies", Jan. 1996, vol. 14, No. 1, pp. 5-35. cited by applicant .
Bluetooth. "Specification of the Bluetooth System: The ad hoc SCATTERNET for affordable and highly functional wireless connectivity," Core, Version 1.0 A, Jul. 26, 1999, 1068 pages. cited by applicant .
Bluetooth. "Specification of the Bluetooth System: Wireless connections made easy," Core, Version 1.0 B, Dec. 1, 1999, 1076 pages. cited by applicant .
Bretl, W.E., et al., MPEG2 Tutorial [online], 2000 [retrieved on Jan. 13, 2009], Retrieved from the Internet, pp. 1-23. cited by applicant .
Canadian Intellectual Property Office, Canadian Office Action dated Apr. 4, 2016, issued in connection with Canadian Patent Application No. 2,842,342, 5 pages. cited by applicant .
Canadian Intellectual Property Office, Canadian Office Action dated Sep. 14, 2015, issued in connection with Canadian Patent Application No. 2,842,342, 2 pages. cited by applicant .
Chakrabarti et al., "A Remotely Controlled Bluetooth Enabled Environment", IEEE, 2004, pp. 77-81. cited by applicant .
Corrected Notice of Allowance dated Aug. 19, 2015, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 2 pages. cited by applicant .
Creative, "Connecting Bluetooth Devices with Creative D200," http://support.creative.com/kb/ShowArticle.aspx?url=http://ask.creative.c- om:80/SRVS/CGI-BIN/WEBCGI.EXE/,/?St=106,E=0000000000396859016,K=9377,Sxi=8- , VARSET=ws:http://us.creative.com,case=63350>, available on Nov. 28, 2011, 2 pages. cited by applicant .
Crown PIP Manual available for sale at least 2004, 68 pages. cited by applicant .
Dell, Inc. "Dell Digital Audio Receiver: Reference Guide," Jun. 2000, 70 pages. cited by applicant .
Dell, Inc. "Start Here," Jun. 2000, 2 pages. cited by applicant .
European Extended Search Report dated Feb. 28, 2014, issued in connection with EP Application No. 13184747.7, 8 pages. cited by applicant .
European Extended Search Report dated Mar. 7, 2016, issued in connection with EP Application No. 13810340.3, 9 pages. cited by applicant .
European Extended Search Report dated Mar. 31, 2015, issued in connection with EP Application No. 14181454.1, 9 pages. cited by applicant .
European Patent Office, Examination Report dated Mar. 22, 2016, issued in connection with European Patent Application No. EP14181454.1, 6 pages. cited by applicant .
Falcone, John, "Sonos BU150 Digital Music System review," CNET, CNET [online] Jul. 27, 2009 [retrieved on Mar. 16, 2016], 11 pages Retrieved from the Internet: URL:http://www.cnet.com/products/sonos-bu150-digital-music-system/. cited by applicant .
Final Office Action dated Jun. 5, 2014, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 12 pages. cited by applicant .
Final Office Action dated Jul. 13, 2009, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 16 pages. cited by applicant .
Final Office Action dated Sep. 13, 2012, issued in connection with U.S. Appl. No. 13/297,000, filed Nov. 15, 2011, 17 pages. cited by applicant .
Final Office Action dated Nov. 18, 2015, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 56 pages. cited by applicant .
Final Office Action dated Oct. 21, 2011, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 19 pages. cited by applicant .
Final Office Action dated Mar. 27, 2014, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 29 pages. cited by applicant .
Final Office Action dated Jan. 28, 2011, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 21 pages. cited by applicant .
Final Office Action dated Jun. 30, 2008, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 30 pages. cited by applicant .
Final Office Action dated Aug. 3, 2015, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 13 pages. cited by applicant .
Final Office Action dated Dec. 3, 2014, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 12 pages. cited by applicant .
Final Office Action dated Jul. 3, 2012, issued in connection with U.S. Appl. No. 13/298,090, filed Nov. 16, 2011, 41 pages. cited by applicant .
Final Office Action dated Mar. 3, 2015, issued in connection with U.S. Appl. No. 13/864,251, filed Apr. 17, 2013, 13 pages. cited by applicant .
Final Office Action dated Mar. 4, 2015, issued in connection with U.S. Appl. No. 13/848,904, filed Mar. 22, 2013, 16 pages. cited by applicant .
Final Office Action dated Mar. 5, 2015, issued in connection with U.S. Appl. No. 13/888,203, filed May 6, 2013, 13 pages. cited by applicant .
Final Office Action dated Jan. 7, 2015, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 14 pages. cited by applicant .
Final Office Action dated Mar. 9, 2015, issued in connection with U.S. Appl. No. 14/516,867, filed Oct. 17, 2014, 14 pages. cited by applicant .
Final Office Action dated Aug. 10, 2015, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 26 pages. cited by applicant .
Final Office Action dated Aug. 11, 2015, issued in connection with U.S. Appl. No. 13/864,247, filed Apr. 17, 2013, 15 pages. cited by applicant .
Final Office Action dated Feb. 11, 2015, issued in connection with U.S. Appl. No. 14/184,526, filed Feb. 19, 2014, 13 pages. cited by applicant .
Final Office Action dated Feb. 11, 2015, issued in connection with U.S. Appl. No. 14/184,935, filed Feb. 20, 2014, 17 pages. cited by applicant .
Final Office Action dated Feb. 12, 2015, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 20 pages. cited by applicant .
Final Office Action dated Oct. 13, 2011, issued in connection with U.S. Appl. No. 12/035,112, filed Feb. 21, 2008, 10 pages. cited by applicant .
Final Office Action dated Jul. 15, 2015, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 18 pages. cited by applicant .
Final Office Action dated Jun. 15, 2015, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 25 pages. cited by applicant .
Final Office Action dated Dec. 17, 2014, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 36 pages. cited by applicant .
Final Office Action dated Jan. 21, 2010, issued in connection with U.S. Appl. No. 11/906,702, filed Oct. 2, 2007, 27 pages. cited by applicant .
Final Office Action dated Oct. 22, 2014, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 12 pages. cited by applicant .
Final Office Action dated Oct. 23, 2014, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 23 pages. cited by applicant .
Final Office Action dated Feb. 24, 2016, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 28 pages. cited by applicant .
Final Office Action dated Apr. 28, 2015, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 20 pages. cited by applicant .
Final Office Action dated Nov. 30, 2015, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 26 pages. cited by applicant .
First Action Interview Office Action Summary dated Apr. 15, 2015, issued in connection with U.S. Appl. No. 14/505,027, filed Oct. 2, 2014, 6 pages. cited by applicant .
Fulton et al., "The Network Audio System: Make Your Application Sing (As Well As Dance)!", The X Resource, 1994, 14 pages. cited by applicant .
Hans et al., "Interacting with Audio Streams for Entertainment and Communication", Proceedings of the Eleventh ACM International Conference on Multimedia, ACM, 2003, 7 pages. cited by applicant .
Horwitz, Jeremy, "Logic3 i-Station25," retrieved from the internet: http://www.ilounge.com/index.php/reviews/entry/logic3-i-station25/, last visited Dec. 17, 2013, 5 pages. cited by applicant .
Huang C.M., et al., "A Synchronization Infrastructure for Multicast Multimedia at the Presentation Layer," IEEE Transactions on Consumer Electronics, 1997, pp. 370-380, vol. 43, No. 3. cited by applicant .
International Bureau, International Preliminary Report on Patentability, dated Jan. 8, 2015, issued in connection with International Application No. PCT/US2013/046372, filed on Jun. 18, 2013, 6 pages. cited by applicant .
International Bureau, International Preliminary Report on Patentability, dated Jan. 8, 2015, issued in connection with International Application No. PCT/US2013/046386, filed on Jun. 18, 2013, 8 pages. cited by applicant .
International Bureau, International Preliminary Report on Patentability dated Jan. 30, 2014, issued in connection with International Application No. PCT/US2012/045894, filed on Jul. 9, 2012, 6 pages. cited by applicant .
International Searching Authority, International Search Report dated Aug. 1, 2008, in connection with International Application No. PCT/US2004/023102, 5 pages. cited by applicant .
International Searching Authority, International Search Report dated Dec. 26, 2012, issued in connection with International Application No. PCT/US2012/045894, filed on Jul. 9, 2012, 3 pages. cited by applicant .
International Searching Authority, International Search Report, dated Sep. 30, 2013, issued in connection with PCT Application No. PCT/US2013/046386, 3 pages. cited by applicant .
International Searching Authority, International Search Report dated Aug. 26, 2013, issued in connection with International Patent application No. PCT/US2013/046372, 3 pages. cited by applicant .
International Searching Authority, Written Opinion dated Aug. 26, 2013, issued in connection with International Application No. PCT/US2013/046372, filed on Jun. 18, 2013, 4 pages. cited by applicant .
International Searching Authority, Written Opinion dated Dec. 26, 2012, issued in connection with International Application No. PCT/US2012/045894, filed on Jul. 9, 2012, 4 pages. cited by applicant .
International Searching Authority, Written Opinion dated Sep. 30, 2013, issued in connection with International Application No. PCT/US2013/046386, filed on Jun. 18, 2013, 6 pages. cited by applicant .
Ishibashi et al., "A Group Synchronization Mechanism for Live Media in Multicast Communications," IEEE Global Telecommunications Conference, 1997, pp. 746-752, vol. 2. cited by applicant .
Ishibashi et al., "A Group Synchronization Mechanism for Stored Media in Multicast Communications," IEEE Information Revolution and Communications, 1997, pp. 692-700, vol. 2. cited by applicant .
Japanese Intellectual Property Office, Decision of Rejection dated Jul. 8, 2014, issued in connection with Japanese Patent Application No. 2012-178711, 3 pages. cited by applicant .
Japanese Intellectual Property Office, Office Action Summary dated Feb. 2, 2016, issued in connection with Japanese Patent Application No. 2015-520286, 6 pages. cited by applicant .
Japanese Intellectual Property Office, Office Action Summary dated Nov. 19, 2013, issued in connection with Japanese Patent Application No. 2012-178711, 5 pages. cited by applicant .
Japanese Patent Office, Notice of Rejection, dated Feb. 3, 2015, issued in connection with Japanese Patent Application No. 2014-521648, 7 pages. cited by applicant .
Japanese Patent Office, Notice of Rejection dated Sep. 15, 2015, issued in connection with Japanese Patent Application No. 2014-220704, 7 pages. cited by applicant .
Japanese Patent Office, Office Action dated Mar. 29, 2016, issued in connection with Japanese Patent Application No. JP2015-520288, 12 pages. cited by applicant .
Jo et al., "Synchronized One-to-many Media Streaming with Adaptive Playout Control," Proceedings of SPIE, 2002, pp. 71-82, vol. 4861. cited by applicant .
Jones, Stephen, "Dell Digital Audio Receiver: Digital upgrade for your analog stereo" Analog Stereo Jun. 24, 2000 retrieved Jun. 18, 2014, 2 pages. cited by applicant .
Levergood et al., "AudioFile: A Network-Transparent System for Distributed Audio Applications", Digital Equipment Corporation, 1993, 109 pages. cited by applicant .
Louderback, Jim, "Affordable Audio Receiver Furnishes Homes With MP3," TechTV Vault. Jun. 28, 2000 retrieved Jul. 10, 2014, 2 pages. cited by applicant .
Advisory Action dated Mar. 8, 2017, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 22 pages. cited by applicant .
AVTransport:1 Service Template Version 1.01 for UPnP, Version 1.0 (Jun. 25, 2002) (66 pages). cited by applicant .
Chinese Patent Office, Second Office Action dated Feb. 27, 2017, issued in connection with Chinese Patent Application No. 201380044380.2, 22 pages. cited by applicant .
ConnectionManager:1 Service Template Version 1.01 for UPnP, Version 1.0 (Jun. 25, 2002) (25 pages). cited by applicant .
ContentDirectory:1 Service Template Version 1.01 for UPnP, Version 1.0 (Jun. 25, 2002) (89 pages). cited by applicant .
Designing a UPnP AV MediaServer, Nelson Kidd (2003) (SONDM000115062-116) (55 pages). cited by applicant .
Final Office Action dated Jun. 2, 2017, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 32 pages. cited by applicant .
Final Office Action dated Apr. 10, 2017, issued in connection with U.S. Appl. No. 15/243,355, filed Aug. 22, 2016, 15 pages. cited by applicant .
Final Office Action dated May 15, 2017, issued in connection with U.S. Appl. No. 13/864,252, filed Apr. 17, 2013, 13 pages. cited by applicant .
Final Office Action dated May 16, 2017, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 14 pages. cited by applicant .
Final Office Action dated May 16, 2017, issued in connection with U.S. Appl. No. 13/864,250, filed Apr. 17, 2013, 12 pages. cited by applicant .
Final Office Action dated Apr. 20, 2017, issued in connection with U.S. Appl. No. 13/864,248, filed Apr. 17, 2013, 14 pages. cited by applicant .
Fries et al., "The MP3 and Internet Audio Handbook: Your Guide to the Digital Music Revolution," 2000, 320 pages. cited by applicant .
General Event Notification Architecture Base: Client to Arbiter (Apr. 2000) (23 pages). cited by applicant .
Home Networking with Universal Plug and Play, IEEE Communications Magazine, vol. 39 No. 12 (Dec. 2001) (D+M_0402025-40) (16 pages). cited by applicant .
Implicit, LLC v. Sonos, Inc. (No. 14-1330-RGA), Defendant's Original Complaint (Mar. 3, 2017) (15 pages). cited by applicant .
Intel Designing a UPnP AV Media Renderer, v. 1.0 ("Intel AV Media Renderer") (May 20, 2003) (SONDM000115117-62) (46 pages). cited by applicant .
Intel Media Renderer Device Interface ("Intel Media Renderer") (Sep. 6, 2002) (62 pages). cited by applicant .
Intel SDK for UPnP Devices Programming Guide, Version 1.2.1, (Nov. 2002) (30 pages). cited by applicant .
Linux SDK for UPnP Devices v. 1.2 (Sep. 6, 2002) (101 pages). cited by applicant .
MediaRenderer:1 Device Template Version 1.01 for UPnP, Version 1.0 (Jun. 25, 2002) (12 pages). cited by applicant .
MediaServer:1 Device Template Version 1.01 for UPnP, Version 1.0 (Jun. 25, 2002) (12 pages). cited by applicant .
Microsoft, Universal Plug and Play (UPnP) Client Support ("Microsoft UPnP") (Aug. 2001) (D+M_0402007-24) (18 pages). cited by applicant .
Microsoft Windows XP Reviewer's Guide (Aug. 2001) (D+M_0402225-85) (61 pages). cited by applicant .
"Microsoft Windows XP File and Printer Share with Microsoft Windows" Microsoft Windows XP Technical Article, 2003, 65 pages. cited by applicant .
"SMPTE Made Simple: A Time Code Tutor by Timeline," 1996, 46 pages. cited by applicant .
Network Time Protocol (NTP), RFC 1305 (Mar. 1992) (D+M_0397417-536) (120 pages). cited by applicant .
Niederst, Jennifer, "O'Reilly Web Design in a Nutshell," Second Edition, Sep. 2001, 678 pages. cited by applicant .
Non-Final Office Action dated Apr. 10, 2017, issued in connection with U.S. Appl. No. 13/871,785, filed Apr. 26, 2013, 10 pages. cited by applicant .
Non-Final Office Action dated Apr. 20, 2017, issued in connection with U.S. Appl. No. 90/013,882, filed Dec. 27, 2016, 197 pages. cited by applicant .
Notice of Allowance dated Jun. 1, 2017, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 12 pages. cited by applicant .
Notice of Allowance dated Apr. 3, 2017, issued in connection with U.S. Appl. No. 15/088,678, filed Apr. 1, 2016, 8 pages. cited by applicant .
Notice of Allowance dated Mar. 9, 2017, issued in connection with U.S. Appl. No. 15/080,591, filed Mar. 25, 2016, 7 pages. cited by applicant .
Notice of Allowance dated Feb. 10, 2017, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 13 pages. cited by applicant .
Notice of Allowance dated Mar. 15, 2017, issued in connection with U.S. Appl. No. 15/080,716, filed Mar. 25, 2016, 7 pages. cited by applicant .
Notice of Allowance dated Apr. 25, 2017, issued in connection with U.S. Appl. No. 15/156,392, filed May 17, 2016, 8 pages. cited by applicant .
Notice of Allowance dated Mar. 27, 2017, issued in connection with U.S. Appl. No. 15/089,758, filed Apr. 4, 2016, 7 pages. cited by applicant .
Notice of Allowance dated Mar. 28, 2017, issued in connection with U.S. Appl. No. 15/088,906, filed Apr. 1, 2016, 7 pages. cited by applicant .
Notice of Allowance dated Mar. 28, 2017, issued in connection with U.S. Appl. No. 15/155,149, filed May 16, 2016, 7 pages. cited by applicant .
Request for Ex Parte Reexamination submitted in U.S. Pat. No. 9,213,357 on May 22, 2017, 85 pages. cited by applicant .
RenderingControl:1 Service Template Version 1.01 for UPnP, Version 1.0, (Jun. 25, 2002) (SONDM000115187-249) (63 pages). cited by applicant .
Notice of Allowance dated Mar. 30, 2017, issued in connection with U.S. Appl. No. 15/088,532, filed Apr. 1, 2016, 7 pages. cited by applicant .
Notice of Allowance dated Apr. 6, 2017, issued in connection with U.S. Appl. No. 15/088,283, filed Apr. 1, 2016, 7 pages. cited by applicant .
Notice of Incomplete Re-Exam Request dated May 25, 2017, issued in connection with U.S. Appl. No. 90/013,959, filed Apr. 1, 2016, 10 pages. cited by applicant .
Pre-Brief Conference Decision dated May 11, 2017, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 2 pages. cited by applicant .
Real Time Control Protocol (RTCP) and Realtime Transfer Protocol (RTP), RFC 1889 (Jan. 1996) (D+M_0397810-84) (75 pages). cited by applicant .
Realtime Streaming Protocol (RTSP), RFC 2326 (Apr. 1998) (D+M_0397945-8036) (92 pages). cited by applicant .
Realtime Transport Protocol (RTP), RFC 3550 (Jul. 2003) (D+M_0398235-323) (89 pages). cited by applicant .
Advisory Action dated Jun. 20, 2017, issued in connection with U.S. Appl. No. 15/243,355, filed Aug. 22, 2016, 5 pages. cited by applicant .
Final Office Action dated Jul. 11, 2017, issued in connection with U.S. Appl. No. 15/243,186, filed Aug. 22, 2016, 13 pages. cited by applicant .
Final Office Action dated Jun. 15, 2017, issued in connection with U.S. Appl. No. 15/228,639, filed Aug. 4, 2016, 16 pages. cited by applicant .
Final Office Action dated Jun. 28, 2017, issued in connection with U.S. Appl. No. 14/808,875, filed Jul. 24, 2015, 14 pages. cited by applicant .
First Action Pre-Interview Office Action dated Jun. 22, 2017, issued in connection with U.S. Appl. No. 14/516,883, filed Oct. 17, 2014, 5 pages. cited by applicant .
Non-Final Office Action dated Jul. 11, 2017, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 10 pages. cited by applicant .
Non-Final Office Action dated Jul. 26, 2017, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 14 pages. cited by applicant .
Notice of Allowance dated Jul. 12, 2017, issued in connection with U.S. Appl. No. 13/894,179, filed May 14, 2013, 10 pages. cited by applicant .
Notice of Allowance dated Jul. 13, 2017, issued in connection with U.S. Appl. No. 13/895,076, filed May 15, 2013, 10 pages. cited by applicant .
Renewed Request for Ex Parte Re-Examination, U.S. Appl. No. 90/013,959, filed Jun. 16, 2017, 126 pages. cited by applicant .
European Patent Office, Examination Report dated Oct. 24, 2016, issued in connection with European Patent Application No. 13808623.6, 4 pages. cited by applicant .
"Final Office Action dated Oct. 19, 2016, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 14 pages". cited by applicant .
Non-Final Office Action dated Nov. 3, 2016, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 17 pages. cited by applicant .
Non-Final Office Action dated Oct. 4, 2016, issued in connection with U.S. Appl. No. 15/089,758, filed Apr. 4, 2016, 9 pages. cited by applicant .
Non-Final Office Action dated Oct. 5, 2016, issued in connection with U.S. Appl. No. 13/864,250, filed Apr. 17, 2013, 10 pages. cited by applicant .
Non-Final Office Action dated Oct. 5, 2016, issued in connection with U.S. Appl. No. 13/864,252, filed Apr. 17, 2013, 11 pages. cited by applicant .
Non-Final Office Action dated Oct. 6, 2016, issued in connection with U.S. Appl. No. 15/088,678, filed Apr. 1, 2016, 9 pages. cited by applicant .
Non-Final Office Action dated Oct. 7, 2016, issued in connection with U.S. Appl. No. 15/156,392, filed May 17, 2016, 8 pages. cited by applicant .
Non-Final Office Action dated Nov. 10, 2016, issued in connection with U.S. Appl. No. 15/243,355, filed Aug. 22, 2016, 11 pages. cited by applicant .
Non-Final Office Action dated Sep. 21, 2016, issued in connection with U.S. Appl. No. 15/080,591, filed Mar. 25, 2016, 9 pages. cited by applicant .
Non-Final Office Action dated Sep. 21, 2016, issued in connection with U.S. Appl. No. 15/080,716, filed Mar. 25, 2016, 8 pages. cited by applicant .
Non-Final Office Action dated Sep. 21, 2016, issued in connection with U.S. Appl. No. 15/088,283, filed Apr. 1, 2016, 8 pages. cited by applicant .
Non-Final Office Action dated Sep. 21, 2016, issued in connection with U.S. Appl. No. 15/088,532, filed Apr. 1, 2016, 9 pages. cited by applicant .
Non-Final Office Action dated Sep. 22, 2016, issued in connection with U.S. Appl. No. 15/088,906, filed Apr. 1, 2016, 9 pages. cited by applicant .
Non-Final Office Action dated Sep. 22, 2016, issued in connection with U.S. Appl. No. 15/155,149, filed May 16, 2016, 7 pages. cited by applicant .
Non-Final Office Action dated Sep. 30, 2016, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 12 pages. cited by applicant .
Notice of Allowance dated Oct. 19, 2016, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 14 pages. cited by applicant .
Advisory Action dated Nov. 1, 2013, issued in connection with U.S. Appl. No. 13/618,829, filed Sep. 14, 2012, 3 pages. cited by applicant .
Advisory Action dated Aug. 10, 2017, issued in connection with U.S. Appl. No. 13/864,250, filed Apr. 17, 2013, 3 pages. cited by applicant .
Advisory Action dated Jan. 10, 2018, issued in connection with U.S. Appl. No. 13/871,785, filed Apr. 26, 2013, 3 pages. cited by applicant .
Advisory Action dated Jun. 11, 2018, issued in connection with U.S. Appl. No. 90/013,959, filed Jun. 16, 2017, 3 pages. cited by applicant .
Advisory Action dated Aug. 16, 2017, issued in connection with U.S. Appl. No. 13/864,248, filed Apr. 17, 2013, 5 pages. cited by applicant .
Advisory Action dated Aug. 22, 2017, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 3 pages. cited by applicant .
Advisory Action dated Sep. 22, 2017, issued in connection with U.S. Appl. No. 15/243,186, filed Aug. 22, 2016, 3 pages. cited by applicant .
Advisory Action dated Apr. 27, 2016, issued in connection with U.S. Appl. No. 14/486,667, filed Sep. 15, 2014, 7 pages. cited by applicant .
Canadian Patent Office, Office Action dated Jul. 10, 2018, issued in connection with Canadian Application No. 2982726, 3 pages. cited by applicant .
Corrected Notice of Allowability dated Dec. 23, 2016, issued in connection with U.S. Appl. No. 14/803,953, filed Jul. 20, 2015, 18 pages. cited by applicant .
European Patent Office, European Office Action dated Sep. 1, 2017, issued in connection with European Application No. 13184747.7, 7 pages. cited by applicant .
European Patent Office, Summons to Attend Oral Proceedings dated Jul. 10, 2018, issued in connection with European Application No. 13184747.7, 10 pages. cited by applicant .
File History of Re-Examination U.S. Appl. No. 90/013,423 (Sonos Ref No. 12-0902-REX). cited by applicant .
Final Office Action dated Jul. 5, 2013, issued in connection with U.S. Appl. No. 13/618,829, filed Sep. 14, 2012, 22 pages. cited by applicant .
Final Office Action dated Jun. 6, 2018, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 14 pages. cited by applicant .
Final Office Action dated Mar. 8, 2017, issued in connection with U.S. Appl. No. 14/486,667, filed Sep. 15, 2014, 39 pages. cited by applicant .
Final Office Action dated Nov. 8, 2017, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 13 pages. cited by applicant .
Final Office Action dated Nov. 8, 2017, issued in connection with U.S. Appl. No. 13/871,785, filed Apr. 26, 2013, 12 pages. cited by applicant .
Final Office Action dated Mar. 1, 2018, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 16 pages. cited by applicant .
Final Office Action dated Jul. 11, 2018, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 13 pages. cited by applicant .
Final Office Action dated Jul. 11, 2018, issued in connection with U.S. Appl. No. 13/864,252, filed Apr. 17, 2013, 10 pages. cited by applicant .
Final Office Action dated Aug. 14, 2009, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 28 pages. cited by applicant .
Final Office Action dated Feb. 15, 2018, issued in connection with U.S. Appl. No. 14/516,883, filed Oct. 17, 2014, 17 pages. cited by applicant .
Final Office Action dated Mar. 16, 2011, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 40 pages. cited by applicant .
Final Office Action dated Mar. 16, 2018, issued in connection with U.S. Appl. No. 90/013,959, filed Jun. 16, 2017, 39 pages. cited by applicant .
Final Office Action dated May 16, 2018, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 11 pages. cited by applicant .
Final Office Action dated Dec. 24, 2009, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 29 pages. cited by applicant .
Final Office Action dated Mar. 29, 2018, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 24 pages. cited by applicant .
Final Office Action dated Dec. 31, 2015, issued in connection with U.S. Appl. No. 14/486,667, filed Sep. 15, 2014, 34 pages. cited by applicant .
First Office Action Interview dated Aug. 30, 2017, issued in connection with U.S. Appl. No. 14/516,883, filed Oct. 17, 2014, 5 pages. cited by applicant .
Japanese Patent Office, Japanese Office Action dated Oct. 3, 2017, issued in connection with Japanese Application No. 2016-163042, 5 pages. cited by applicant .
Japanese Patent Office, Office Action dated May 15, 2018, issued in connection with Japanese Application No. 2016-163042, 6 pages. cited by applicant .
Japanese Patent Office, Translation of Office Action dated May 15, 2018, issued in connection with Japanese Application No. 2016-163042, 4 pages. cited by applicant .
Non-Final Office Action dated Sep. 1, 2010, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 36 pages. cited by applicant .
Non-Final Office Action dated Nov. 2, 2016, issued in connection with U.S. Appl. No. 14/486,667, filed Sep. 15, 2014, 37 pages. cited by applicant .
Non-Final Office Action dated Feb. 3, 2009, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 32 pages. cited by applicant .
Non-Final Office Action dated Oct. 3, 2014, issued in connection with U.S. Appl. No. 13/863,083, filed Apr. 15, 2013, 22 pages. cited by applicant .
Non-Final Office Action dated Nov. 7, 2011, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 48 pages. cited by applicant .
Non-Final Office Action dated Jan. 10, 2018, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 18 pages. cited by applicant .
Non-Final Office Action dated Nov. 13, 2017, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 11 pages. cited by applicant .
Non-Final Office Action dated Dec. 14, 2017, issued in connection with U.S. Appl. No. 15/081,911, filed Mar. 27, 2016, 17 pages. cited by applicant .
Non-Final Office Action dated Nov. 14, 2017, issued in connection with U.S. Appl. No. 13/864,252, filed Apr. 17, 2013, 11 pages. cited by applicant .
Non-Final Office Action dated Aug. 15, 2017, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 11 pages. cited by applicant .
Non-Final Office Action dated Jul. 15, 2016, issued in connection with U.S. Appl. No. 14/803,953, filed Jul. 20, 2015, 20 pages. cited by applicant .
Non-Final Office Action dated Nov. 15, 2017, issued in connection with U.S. Appl. No. 15/228,639, filed Aug. 4, 2016, 14 pages. cited by applicant .
Non-Final Office Action dated Nov. 15, 2017, issued in connection with U.S. Appl. No. 15/243,186, filed Aug. 22, 2016, 13 pages. cited by applicant .
Non-Final Office Action dated Aug. 17, 2017, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 12 pages. cited by applicant .
Non-Final Office Action dated Jan. 18, 2013, issued in connection with U.S. Appl. No. 13/618,829, filed Sep. 14, 2012, 58 pages. cited by applicant .
Non-Final Office Action dated Apr. 2, 2018, issued in connection with U.S. Appl. No. 15/243,355, filed Aug. 22, 2016, 20 pages. cited by applicant .
Non-Final Office Action dated Oct. 20, 2017, issued in connection with U.S. Appl. No. 90/013,959, filed Jun. 16, 2017, 50 pages. cited by applicant .
Non-Final Office Action dated Oct. 23, 2014, issued in connection with U.S. Appl. No. 13/932,864, filed Jul. 1, 2013, 20 pages. cited by applicant .
Non-Final Office Action dated Apr. 24, 2018, issued in connection with U.S. Appl. No. 15/095,145, filed Apr. 10, 2016, 13 pages. cited by applicant .
Non-Final Office Action dated Aug. 28, 2017, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 17 pages. cited by applicant .
Non-Final Office Action dated Nov. 28, 2017, issued in connection with U.S. Appl. No. 13/864,248, filed Apr. 17, 2013, 12 pages. cited by applicant .
Non-Final Office Action dated Aug. 29, 2017, issued in connection with U.S. Appl. No. 14/058,166, filed Oct. 18, 2013, 12 pages. cited by applicant .
Non-Final Office Action dated Jul. 30, 2018, issued in connection with U.S. Appl. No. 16/009,182, filed Jun. 14, 2018, 22 pages. cited by applicant .
Non-Final Office Action dated Dec. 31, 2013, issued in connection with U.S. Appl. No. 13/618,829, filed Sep. 14, 2012, 26 pages. cited by applicant .
Non-Final Office Action dated Jan. 9, 2018, issued in connection with U.S. Appl. No. 13/864,250, filed Apr. 17, 2013, 13 pages. cited by applicant .
Notice of Allowance dated Jun. 2, 2017, issued in connection with U.S. Appl. No. 14/486,667, filed Sep. 15, 2014, 10 pages. cited by applicant .
Notice of Allowance dated Jul. 6, 2018, issued in connection with U.S. Appl. No. 14/058,166, filed Oct. 18, 2013, 19 pages. cited by applicant .
Notice of Allowance dated Aug. 10, 2018, issued in connection with U.S. Appl. No. 15/081,911, filed Mar. 27, 2016, 5 pages. cited by applicant .
Notice of Allowance dated Jul. 10, 2018, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 9 pages. cited by applicant .
Notice of Allowance dated May 10, 2018, issued in connection with U.S. Appl. No. 13/864,248, filed Apr. 17, 2013, 8 pages. cited by applicant .
Notice of Allowance dated Aug. 14, 2012, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 33 pages. cited by applicant .
Notice of Allowance dated Jul. 18, 2014, issued in connection with U.S. Appl. No. 13/618,829, filed Sep. 14, 2012, 8 pages. cited by applicant .
Notice of Allowance dated Nov. 23, 2016, issued in connection with U.S. Appl. No. 14/803,953, filed Jul. 20, 2015, 21 pages. cited by applicant .
Notice of Allowance dated Apr. 27, 2015, issued in connection with U.S. Appl. No. 13/932,864, filed Jul. 1, 2013, 20 pages. cited by applicant .
Notice of Allowance dated Jun. 27, 2018, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 19 pages. cited by applicant .
Notice of Allowance dated Apr. 29, 2015, issued in connection with U.S. Appl. No. 13/863,083, filed Apr. 15, 2013, 19 pages. cited by applicant .
Notice of Allowance dated Mar. 29, 2017, issued in connection with U.S. Appl. No. 14/803,953, filed Jul. 20, 2015, 8 pages. cited by applicant .
Notice of Allowance dated Jan. 31, 2018, issued in connection with U.S. Appl. No. 13/871,785, filed Apr. 26, 2013, 6 pages. cited by applicant .
Notice of Intent to Issue Re-Examination Certificate dated Aug. 3, 2017, issued in connection with U.S. Appl. No. 90/013,882, filed Dec. 27, 2016, 20 pages. cited by applicant .
"Sonos Multi-Room Music System User Guide," Version 090401, Sonos, Inc. Apr. 1, 2009, 256 pages. cited by applicant .
Pro Tools Reference Guide Version 5.3 Manual, 2002, 582 pages. cited by applicant .
Reexam Non-Final Office Action dated Nov. 9, 2016, issued in connection with U.S. Appl. No. 90/013,774, filed Jun. 29, 2016, 35 pages. cited by applicant .
Solid State Logic G Series Systems Operator's Shortform Guide: SSL G Series Console, 1994, 49 pages. cited by applicant .
Solid State Logic SL 9000 J Series Total Studio System: Console Operator's Manual, 1994, 415 pages. cited by applicant .
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' 35 U.S.C. § 282 Notice filed Nov. 2, 2017, 31 pages. cited by applicant .
Sonos System Overview, Version 1.0, Jul. 2011, 12 pages. cited by applicant.

Primary Examiner: McCord; Paul C
Attorney, Agent or Firm: McDonnell Boehnen Hulbert & Berghoff LLP

Parent Case Text



CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. application Ser. No. 14/808,875 filed on Jul. 24, 2015, and currently pending; U.S. application Ser. No. 14/808,875 is a continuation of U.S. application Ser. No. 13/907,666 filed on May 31, 2013, and issued on Sep. 22, 2015, as U.S. Pat. No. 9,141,645; U.S. application Ser. No. 13/907,666 is a continuation of U.S. application Ser. No. 13/619,237 filed on Sep. 14, 2012, and issued on Nov. 19, 2013, as U.S. Pat. No. 8,588,949; U.S. application Ser. No. 13/619,237 is a continuation of U.S. application Ser. No. 12/035,112 filed on Feb. 21, 2008, and issued on Oct. 16, 2012, as U.S. Pat. No. 8,290,603; U.S. application Ser. No. 12/035,112 is a continuation-in-part of U.S. application Ser. No. 10/861,653 filed on Jun. 5, 2004, and issued on Aug. 4, 2009, as U.S. Pat. No. 7,571,014; U.S. application Ser. No. 10/861,653 is a continuation-in-part of U.S. application Ser. No. 10/816,217 filed on Apr. 1, 2004, and issued on Jul. 31, 2012, as U.S. Pat. No. 8,234,395; U.S. application Ser. No. 10/816,217 claims priority to U.S. Prov. App. 60/490,768 filed on Jul. 28, 2003. The entire contents of the U.S. application Ser. Nos. 14/808,875; 13/907,666; 13/619,237; 12/035,112; 10/861,653; 10/816,217; and 60/490,768 applications are incorporated herein by reference.
Claims



The invention claimed is:

1. A computing device comprising: one or more processors; and tangible, non-transitory computer-readable memory having stored thereon instructions, wherein the instructions, when executed by the one or more processors cause the computing device to perform functions comprising: displaying on a first user interface, a first plurality of graphical representations, wherein a first graphical representation of the first plurality of graphical representations corresponds to a first zone in a media playback system, and wherein a second graphical representation of the first plurality of graphical representations corresponds to a second zone in the media playback system; receiving via the first user interface, an input to create a zone group comprising the first zone and the second zone; after receiving the input to create the zone group comprising the first zone and the second zone, displaying on a second user interface, a second plurality of graphical representations, wherein a first graphical representation of the second plurality of graphical representations corresponds to the first zone, wherein a second graphical representation of the second plurality of graphical representations corresponds to the second zone, and wherein a third graphical representation of the second plurality of graphical representations corresponds to the zone group; receiving via the second user interface, an input to cause the zone group to play audio content; in response to the input to cause the zone group to play audio content, transmitting to at least one of the first and second zones, a command to play audio content as the zone group; and changing both a first volume setting of the first zone to a second volume setting of the first zone and a first volume setting of the second zone to a second volume setting of the second zone by adjusting a group volume control representation, wherein a value of the group volume control representation always corresponds to an average of the first volume setting and the second volume setting.

2. The computing device of claim 1, wherein receiving via the first user interface, the input to create a zone group comprising the first zone and the second zone comprises: receiving via the first user interface, an input to create a zone group; and after receiving the input to create the zone group, receiving input indicating (i) a selection of the first zone and (ii) a selection of the second zone.

3. The computing device of claim 1, wherein receiving via the second user interface, the input to cause the zone group to play audio content comprises: receiving via the second user interface, an input to play the audio content; and after receiving the input to play the audio content, receiving an input indicating a selection of the third graphical representation of the second plurality of graphical representations.

4. The computing device of claim 1, wherein receiving via the second user interface, the input to cause the zone group to play audio content comprises: receiving via the second user interface, an input indicating a selection of the third graphical representation of the second plurality of graphical representations; and after receiving the input indicating the selection of the third graphical representation, receiving an input to play audio content.

5. The computing device of claim 1, wherein the command to play audio content as the zone group comprises a command to cause at least the first and second zones to play the audio content in synchrony.

6. The computing device of claim 1, wherein transmitting to at least one of the first and second zones, a command to play audio content as the zone group comprises: transmitting to at least one of the first and second zones, a network address from which the audio content can be received.

7. The computing device of claim 1, wherein the functions further comprise: after receiving via the first user interface, the input to create a zone group comprising the first zone and the second zone, transmitting to a first computing device of two or more computing devices in the zone group, a command to become a zone group head of the two or more computing devices in the zone group.

8. A tangible, non-transitory computer-readable medium having instructions stored thereon, wherein the instructions, when executed by one or more processors, cause a computing device to perform functions comprising: displaying on a first user interface, a first plurality of graphical representations, wherein a first graphical representation of the first plurality of graphical representations corresponds to a first zone in a media playback system, and wherein a second graphical representation of the first plurality of graphical representations corresponds to a second zone in the media playback system; receiving via the first user interface, an input to create a zone group comprising the first zone and the second zone; after receiving the input to create the zone group comprising the first zone and the second zone, displaying on a second user interface, a second plurality of graphical representations, wherein a first graphical representation of the second plurality of graphical representations corresponds to the first zone, wherein a second graphical representation of the second plurality of graphical representations corresponds to the second zone, and wherein a third graphical representation of the second plurality of graphical representations corresponds to the zone group; receiving via the second user interface, an input to cause the zone group to play audio content; in response to the input to cause the zone group to play audio content, transmitting to at least one of the first and second zones, a command to play audio content as the zone group; and changing both a first volume setting of the first zone to a second volume setting of the first zone and a first volume setting of the second zone to a second volume setting of the second zone by adjusting a group volume control representation, wherein a value of the group volume control representation always corresponds to an average of the first volume setting and the second volume setting.

9. The tangible, non-transitory computer-readable medium of claim 8, wherein receiving via the first user interface, the input to create a zone group comprising the first zone and the second zone comprises: receiving via the first user interface, an input to create a zone group; and after receiving the input to create the zone group, receiving input indicating (i) a selection of the first zone and (ii) a selection of the second zone.

10. The tangible, non-transitory computer-readable medium of claim 8, wherein receiving via the second user interface, the input to cause the zone group to play audio content comprises: receiving via the second user interface, an input to play the audio content; and after receiving the input to play the audio content, receiving an input indicating a selection of the third graphical representation of the second plurality of graphical representations.

11. The tangible, non-transitory computer-readable medium of claim 8, wherein receiving via the second user interface, the input to cause the zone group to play audio content comprises: receiving via the second user interface, an input indicating a selection of the third graphical representation of the second plurality of graphical representations; and after receiving the input indicating the selection of the third graphical representation, receiving an input to play audio content.

12. The tangible, non-transitory computer-readable medium of claim 8, wherein the command to play audio content as the zone group comprises a command to cause at least the first and second zones to play the audio content in synchrony.

13. The tangible, non-transitory computer-readable medium of claim 8, wherein transmitting to at least one of the first and second zones, a command to play audio content as the zone group comprises: transmitting to at least one of the first and second zones, a network address from which the audio content can be received.

14. The tangible, non-transitory computer-readable medium of claim 8, wherein the functions further comprise: after receiving via the first user interface, the input to create a zone group comprising the first zone and the second zone, transmitting to a first computing device of two or more computing devices in the zone group, a command to become a zone group head of the two or more computing devices in the zone group.

15. A method comprising: displaying by a computing device on a first user interface, a first plurality of graphical representations, wherein a first graphical representation of the first plurality of graphical representations corresponds to a first zone in a media playback system, and wherein a second graphical representation of the first plurality of graphical representations corresponds to a second zone in the media playback system; receiving by the computing device via the first user interface, an input to create a zone group comprising the first zone and the second zone; after receiving the input to create the zone group comprising the first zone and the second zone, displaying by the computing device on a second user interface, a second plurality of graphical representations, wherein a first graphical representation of the second plurality of graphical representations corresponds to the first zone, wherein a second graphical representation of the second plurality of graphical representations corresponds to the second zone, and wherein a third graphical representation of the second plurality of graphical representations corresponds to the zone group; receiving by the computing device via the second user interface, an input to cause the zone group to play audio content; in response to the input to cause the zone group to play audio content, transmitting by the computing device to at least one of the first and second zones, a command to play audio content as the zone group; and changing both a first volume setting of the first zone to a second volume setting of the first zone and a first volume setting of the second zone to a second volume setting of the second zone by adjusting a group volume control representation, wherein a value of the group volume control representation always corresponds to an average of the first volume setting and the second volume setting.

16. The method of claim 15, wherein receiving by the computing device via the first user interface, the input to create a zone group comprising the first zone and the second zone comprises: receiving by the computing device via the first user interface, an input to create a zone group; and after receiving the input to create the zone group, receiving by the computing device, input indicating (i) a selection of the first zone and (ii) a selection of the second zone.

17. The method of claim 15, wherein receiving by the computing device via the second user interface, the input to cause the zone group to play audio content comprises: receiving by the computing device via the second user interface, an input to play the audio content; and after receiving the input to play the audio content, receiving by the computing device, an input indicating a selection of the third graphical representation of the second plurality of graphical representations.

18. The method of claim 15, wherein receiving by the computing device via the second user interface, the input to cause the zone group to play audio content comprises: receiving via the second user interface, an input indicating a selection of the third graphical representation of the second plurality of graphical representations; and after receiving the input indicating the selection of the third graphical representation, receiving an input to play audio content.

19. The method of claim 15, wherein the command to play audio content as the zone group comprises a command to cause at least the first and second zones to play the audio content in synchrony.

20. The method of claim 15, wherein transmitting by the computing device to at least one of the first and second zones, a command to play audio content as the zone group comprises: transmitting by the computing device to at least one of the first and second zones, a network address from which the audio content can be received.
Description



BACKGROUND OF THE INVENTION

Field of the Invention

The invention is generally related to the area of consumer electronics and human-computer interaction. In particular, the invention is related to user interfaces for controlling or manipulating a plurality of multimedia players in a multi-zone system.

An enduring passion for quality audio reproduction continues to drive demand from users. One such demand is for an audio system in a house in which, for example, one person can grill to classic rock on the patio while another cooks to his or her own music selections in the kitchen, all at the same time that a teenager catches a ballgame in the family room and another family member blasts pop in a bedroom. The best part of such an audio system is that no family member needs his or her own stereo system--one system gives everyone access to all of the music sources.

Currently, one of the systems that can meet part of such demand is a conventional multi-zone audio system that usually includes a number of audio players. Each of the audio players has its own amplifier(s) and a set of speakers, and is typically installed in one place (e.g., a room). In order to play an audio source at one location, the audio source must be provided locally or from a centralized location. When the audio source is provided locally, the multi-zone audio system functions as a collection of many stereo systems, making source sharing difficult. When the audio source is provided centrally, the centralized location may include a juke box, many compact discs, an AM or FM radio, tapes, or others. To send an audio source to an audio player demanding such a source, a cross-bar type of device is used to prevent the audio source from going to other audio players that may be playing other audio sources.

In order to play different audio sources in different audio players, the traditional multi-zone audio system is generally either hard-wired or controlled by a pre-configured and pre-programmed controller. While the pre-programmed configuration may be satisfactory in one situation, it may not be suitable for another. For example, a person may wish to listen to broadcast news from his or her favorite radio station in a bedroom, a bathroom and a den while preparing to go to work in the morning. The same person may wish to listen in the den and the living room to music from a compact disc in the evening. In order to satisfy such requirements, two groups of audio players must be established. In the morning, the audio players in the bedroom, the bathroom and the den need to be grouped for the broadcast news. In the evening, the audio players in the den and the living room are grouped for the music. Over the weekend, the audio players in the den, the living room, and a kitchen are grouped for party music. Because the morning group, the evening group and the weekend group each contain the den, it can be difficult for the traditional system to accommodate the requirement of dynamically managing the ad hoc creation and deletion of groups.

There is a need for dynamic control of the audio players as a group. With minimal manipulation, the audio players may be readily grouped. There is a further need for user interfaces that may be readily utilized to group and control the audio players.

SUMMARY OF THE INVENTION

This section is for the purpose of summarizing some aspects of the present invention and to briefly introduce some preferred embodiments. Simplifications or omissions in this section as well as in the abstract or the title of this description may be made to avoid obscuring the purpose of this section, the abstract and the title. Such simplifications or omissions are not intended to limit the scope of the present invention.

In general, the present invention pertains to controlling a plurality of multimedia players, or simply players, in groups. According to one aspect of the present invention, a mechanism is provided to allow a user to group some of the players according to a theme or scene, where each of the players is located in a zone. When the scene is activated, the players in the scene react in a synchronized manner. For example, the players in the scene are all caused to play an audio source or music in a playlist, wherein the audio source may be located anywhere on a network.

According to another aspect of the present invention, various user interfaces are provided to facilitate a user to create and manage a group and also to create, edit or update a playlist for the group. Depending on the implementation, the user interfaces may be displayed on a touch screen with which a user may interact directly to group the players, or they may be displayed on a display operated via other input means (e.g., a stylus, a scroll wheel, or arrow buttons). In addition, the user interfaces are configured to show graphically how many players are in a group versus the other individual players.

According to still another aspect of the present invention, the scene may be activated at any time or at a specific time. A user may activate the scene at any time so that only some selected zones in an entertainment system facilitate playback of an audio source. When the scene is activated at a specific time, the scene may be used as an alarm or buzzer.

According to still another aspect of the present invention, a controlling device (also referred to herein as a controller) is provided to facilitate a user to select any of the players in the system to form respective groups, each of which is set up per a scene. Although various scenes may be saved in any of the members in a group, commands are preferably sent from the controller to the rest of the members when one of the scenes is executed. Depending on the implementation, the commands include parameters pertaining to identifiers of the players, volume settings, the audio source, and so on.

According to yet another aspect of the present invention, a configurable module is implemented in the controlling device that provides an interactive graphical user interface for forming, managing and controlling groups in the system, de-grouping a group, or adjusting the audio volume of individual players or of a group of players.

The present invention may be implemented in many forms including software, hardware or a combination of both. According to one embodiment, the present invention is directed to a method for groupings in a multi-zone media system. The method comprises providing a mechanism to allow a user to determine which players in the system are to be associated with a theme representing a group; and configuring the theme with parameters pertaining to the players, wherein the theme is activated at any time or at a specific time so that the players react in a synchronized manner. The players in a scene are synchronized to play a multimedia file when the scene is activated.

According to another embodiment, the present invention is directed to a method for groupings in a multi-zone media system. The method comprises providing a user interface to allow a user to determine which players in the system are to be associated with a theme representing a group, the user interface showing all available players at the time the user interface is created; allowing the user to visually select one of the players to be a first member of the theme; allowing the user to add more of the available players to the theme, if desired; and configuring the theme with parameters pertaining to the players. The theme may be activated at any time or at a specific time so that the players react in a synchronized manner.

According to still another embodiment, the present invention is directed to an entertainment system for grouping players. The system comprises: a plurality of players, each located in one zone; and a controller providing a mechanism to allow a user to select which of the players are to be associated with a theme representing a group, and to configure the theme with parameters pertaining to the selected players, wherein the theme is activated at any time or at a specific time so that the selected players react in a synchronized manner. As a result, the selected players are synchronized to play multimedia content that is in a digital format and retrieved from a source over a network.

One of the objects, features, and advantages of the present invention is to remotely control a plurality of multimedia players in a multi-zone system, playing and controlling the audio source synchronously if the players are grouped together, or playing and controlling the audio source individually if the players are disassociated from each other.

Other objects, features, and advantages of the present invention will become apparent upon examining the following detailed description of an embodiment thereof, taken in conjunction with the attached drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:

FIG. 1 shows an exemplary configuration in which the present invention may be practiced;

FIG. 2A shows an exemplary functional block diagram of a player in accordance with the present invention;

FIG. 2B shows an example of a controller that may be used to remotely control one or more players of FIG. 2A;

FIG. 2C shows an exemplary internal functional block diagram of a controller in accordance with one embodiment of the present invention;

FIG. 3A provides an illustration of one zone scene, where the left column shows the starting zone grouping--all zones are separate, the column on the right shows the effects of grouping the zones to make a group of 3 zones named after "Morning";

FIG. 3B shows that a user defines multiple groups to be gathered at the same time;

FIG. 3C shows an exemplary user interface (UI) of individual zones in a house;

FIG. 3D shows a user interface as a result of the user activating "link zones" of FIG. 3C;

FIG. 3E shows a user interface after the user has selected some of the available zone players into the scene;

FIG. 4 shows an exemplary user interface that may be displayed on a controller or a computer of FIG. 1;

FIG. 5A shows another user interface to allow a user to form a scene;

FIG. 5B shows still another user interface to allow a user to form a scene;

FIG. 5C shows a user interface to allow a user to adjust a volume level of the zone players in a zone scene individually or collectively;

FIG. 6 shows a flowchart or process of providing a player theme or a zone scene for a plurality of players, where one or more of the players are placed in a zone;

FIGS. 7A and 7B illustrate a sequence of screen displays in accordance with one embodiment of the present invention for controlling a plurality of players;

FIG. 7C shows a sequence of screen displays in accordance with one embodiment of the present invention for alternatively controlling players;

FIGS. 8A and 8B show a sequence of screen displays in accordance with one embodiment of the present invention for controlling players regarding audio volume;

FIG. 9 shows a flowchart or process of controlling a plurality of zone players according to one embodiment of the present invention; and

FIG. 10 shows a flowchart or process of controlling audio volume of a plurality of players in a zone group according to one embodiment of the present invention; and

FIGS. 11A-11D show a sequence of screen displays in accordance with one embodiment of the present invention on a computing device for alternatively controlling players.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

The detailed description of the invention is presented largely in terms of procedures, steps, logic blocks, processing, and other symbolic representations that directly or indirectly resemble the operations of data processing devices coupled to networks. These process descriptions and representations are typically used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art. Numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will become obvious to those skilled in the art that the present invention may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuitry have not been described in detail to avoid unnecessarily obscuring aspects of the present invention.

Reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Further, the order of blocks in process flowcharts or diagrams representing one or more embodiments of the invention do not inherently indicate any particular order nor imply any limitations in the invention.

Referring now to the drawings, in which like numerals refer to like parts throughout the several views. FIG. 1 shows an exemplary configuration 100 in which the present invention may be practiced. The configuration may represent, but not be limited to, a part of a residential home, a business building or a complex with multiple zones. There are a number of multimedia players of which three examples 102, 104 and 106 are shown as audio devices. Each of the audio devices may be installed or provided in one particular area or zone and hence referred to as a zone player herein.

As used herein, unless explicitly stated otherwise, an audio source or audio sources are in digital format and can be transported or streamed over a data network. To facilitate the understanding of the present invention, it is assumed that the configuration 100 represents a home. Thus, the zone players 102 and 104 may be located in two of the bedrooms, respectively, while the zone player 106 may be installed in a living room. All of the zone players 102, 104 and 106 are coupled directly or indirectly to a data network 108. In addition, a computing device 110 is shown to be coupled to the network 108. In reality, any other device, such as a home gateway device, a storage device, or an MP3 player, may be coupled to the network 108 as well.

The network 108 may be a wired network, a wireless network or a combination of both. In one example, all devices including the zone players 102, 104 and 106 are coupled to the network 108 by wireless means based on an industry standard such as IEEE 802.11. In yet another example, all devices including the zone players 102, 104 and 106 are part of a local area network that communicates with a wide area network (e.g., the Internet).

Many devices on the network 108 are configured to download and store audio sources. For example, the computing device 110 can download audio sources from the Internet and store the downloaded sources locally for sharing with other devices on the Internet or the network 108. The computing device 110 or any of the zone players can also be configured to receive streaming audio. Shown as a stereo system, the device 112 is configured to receive an analog audio source (e.g., from broadcasting) or retrieve a digital audio source (e.g., from a compact disk). The analog audio sources can be converted to digital audio sources. In accordance with the present invention, the audio source may be shared among the devices on the network 108.

Two or more zone players may be grouped together to form a new zone group. Any combination of zone players and an existing zone group may be grouped together. In one instance, a new zone group is formed by adding one zone player to another zone player or to an existing zone group.

Referring now to FIG. 2A, there is shown an exemplary functional block diagram of a zone player 200 in accordance with the present invention. The zone player 200 includes a network interface 202, a processor 204, a memory 206, an audio processing circuit 210, a module 212, and optionally, an audio amplifier 214 that may be internal or external. The network interface 202 facilitates a data flow between a data network (i.e., the data network 108 of FIG. 1) and the zone player 200 and typically executes a special set of rules (i.e., a protocol) to send data back and forth. One of the common protocols used on the Internet is TCP/IP (Transmission Control Protocol/Internet Protocol). In general, a network interface manages the assembling of an audio source or file into smaller packets that are transmitted over the data network, or reassembles received packets into the original source or file. In addition, the network interface 202 handles the address part of each packet so that it gets to the right destination, or intercepts packets destined for the zone player 200.
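
By way of illustration only, the packetizing and reassembly role described above for the network interface 202 might be modeled as in the following minimal sketch; the function names and the fixed payload size are assumptions made for this example and are not taken from the disclosure.

    # Illustrative sketch: split an audio source into numbered packets for
    # transport over a data network, then rebuild the original on receipt.
    # Names and the payload size below are assumptions, not from the patent.

    PAYLOAD_SIZE = 1400  # bytes of audio carried per packet (assumed)

    def packetize(audio_bytes: bytes) -> list:
        """Split an audio source into (sequence number, payload) packets."""
        return [
            (seq, audio_bytes[i:i + PAYLOAD_SIZE])
            for seq, i in enumerate(range(0, len(audio_bytes), PAYLOAD_SIZE))
        ]

    def reassemble(packets: list) -> bytes:
        """Rebuild the original source, tolerating out-of-order delivery."""
        return b"".join(payload for _, payload in sorted(packets))

    if __name__ == "__main__":
        source = b"\x00\x01" * 4000              # stand-in for audio data
        packets = packetize(source)
        assert reassemble(list(reversed(packets))) == source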

The network interface 202 may include one or both of a wireless interface 216 and a wired interface 217. The wireless interface 216, also referred to as an RF interface, provides network interface functions by a wireless means for the zone player 200 to communicate with other devices in accordance with a communication protocol (such as the wireless standard IEEE 802.11a, 802.11b or 802.11g). The wired interface 217 provides network interface functions by a wired means (e.g., an Ethernet cable). In one embodiment, a zone player includes both of the interfaces 216 and 217, and other zone players include only an RF or a wired interface. Thus these other zone players communicate with other devices on a network or retrieve audio sources via that zone player. The processor 204 is configured to control the operation of other parts in the zone player 200. The memory 206 may be loaded with one or more software modules that can be executed by the processor 204 to achieve desired tasks. According to one aspect of the present invention, when a software module implementing one embodiment of the present invention is executed, the processor 204 operates in accordance with the software module in reference to a saved zone group configuration characterizing a zone group created by a user, and the zone player 200 is caused to retrieve an audio source from another zone player or a device on the network.

According to one embodiment of the present invention, the memory 206 is used to save one or more saved zone configuration files that may be retrieved for modification at any time. Typically, a saved zone group configuration file is transmitted to a controller (e.g., the controlling device 140 or 142 of FIG. 1, a computer, a portable device, or a TV) when a user operates the controlling device. The zone group configuration provides an interactive user interface so that various manipulations or control of the zone players may be performed.
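
For illustration, a saved zone group configuration of the kind described above might be represented as a simple serializable structure such as the following sketch; all field names and values are assumptions chosen for readability and are not drawn from the disclosure.

    # Hypothetical example of a saved zone group configuration that could be
    # stored in memory 206 and transmitted to a controller for editing.
    # Every field name and value here is an illustrative assumption.
    import json

    zone_group_configuration = {
        "group_name": "Morning",
        "group_head": "bedroom",             # player coordinating playback
        "members": ["bedroom", "den", "dining_room"],
        "volumes": {"bedroom": 25, "den": 30, "dining_room": 20},
        "queue": ["news_radio_stream_url"],  # placeholder audio source
    }

    # Serialize for transmission to a controller, then restore for editing.
    wire_format = json.dumps(zone_group_configuration)
    restored = json.loads(wire_format)
    assert restored["members"] == ["bedroom", "den", "dining_room"]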

The audio processing circuit 210 resembles most of the circuitry in an audio playback device and includes one or more digital-to-analog converters (DAC), an audio preprocessing part, an audio enhancement part or a digital signal processor and others. In operation, when an audio source is retrieved via the network interface 202, the audio source is processed in the audio processing circuit 210 to produce analog audio signals. The processed analog audio signals are then provided to the audio amplifier 214 for playback on speakers. In addition, the audio processing circuit 210 may include necessary circuitry to process analog signals as inputs to produce digital signals for sharing with other devices on a network.

Depending on an exact implementation, the module 212 may be implemented as a combination of hardware and software. In one embodiment, the module 212 is used to save a scene. The audio amplifier 214 is typically an analog circuit that amplifies the provided analog audio signals to drive one or more speakers.

Referring now to FIG. 2B, there is shown an exemplary controller 240, which may correspond to the controlling device 140 or 142 of FIG. 1. The controller 240 may be used to facilitate the control of multi-media applications, automation and others in a complex. In particular, the controller 240 is configured to facilitate a selection of a plurality of audio sources available on the network and to control operations of one or more zone players (e.g., the zone player 200) through an RF interface corresponding to the RF interface 216 of FIG. 2A. According to one embodiment, the wireless means is based on an industry standard (e.g., infrared, radio, wireless standard IEEE 802.11a, 802.11b or 802.11g). When a particular audio source is being played in the zone player 200, a picture, if there is any, associated with the audio source may be transmitted from the zone player 200 to the controller 240 for display. In one embodiment, the controller 240 is used to synchronize more than one zone player by grouping the zone players. In another embodiment, the controller 240 is used to control the volume of each of the zone players in a zone group individually or together.

The user interface for the controller 240 includes a screen 242 (e.g., an LCD screen) and a set of functional buttons as follows: a "zones" button 244, a "back" button 246, a "music" button 248, a scroll wheel 250, an "ok" button 252, a set of transport control buttons 254, a mute button 262, a volume up/down button 264, and a set of soft buttons 266 corresponding to the labels 268 displayed on the screen 242.

The screen 242 displays various screen menus in response to a user's selection. In one embodiment, the "zones" button 244 activates a zone management screen or "Zone Menu", which is described in more detail below. The "back" button 246 may lead to different actions depending on the current screen. In one embodiment, the "back" button triggers the current screen display to go back to a previous one. In another embodiment, the "back" button negates the user's erroneous selection. The "music" button 248 activates a music menu, which allows the selection of an audio source (e.g., a song) to be added to a zone player's music queue for playback.

The scroll wheel 250 is used for selecting an item within a list, whenever a list is presented on the screen 242. When the items in the list are too many to be accommodated in one screen display, a scroll indicator such as a scroll bar or a scroll arrow is displayed beside the list. When the scroll indicator is displayed, a user may rotate the scroll wheel 250 to either choose a displayed item or display a hidden item in the list. The "ok" button 252 is used to confirm the user selection on the screen 242.

There are three transport buttons 254, which are used to control the playback of the currently playing song. For example, the functions of the transport buttons may include playing/pausing and fast-forwarding/rewinding a song, moving forward to the next song track, or moving backward to the previous track. According to one embodiment, pressing one of the volume control buttons, such as the mute button 262 or the volume up/down button 264, activates a volume panel. In addition, there are three soft buttons 266 that can be activated in accordance with the labels 268 on the screen 242. It can be understood that, in a multi-zone system, there may be multiple audio sources being played respectively in more than one zone player. The music transport functions described herein shall apply selectively to one of the sources when a corresponding one of the zone players or zone groups is selected.

FIG. 2C illustrates an internal functional block diagram of an exemplary controller 270, which may correspond to the controller 240 of FIG. 2B. The screen 272 on the controller 270 may be an LCD screen. The screen 272 communicates with and is commanded by a screen driver 274 that is controlled by a microcontroller (e.g., a processor) 276. The memory 282 may be loaded with one or more application modules 284 that can be executed by the microcontroller 276, with or without a user input via the user interface 278, to achieve desired tasks. In one embodiment, an application module is configured to facilitate grouping a number of selected zone players into a zone group and synchronizing the zone players for one audio source. In another embodiment, an application module is configured to control together the audio volumes of the zone players in a zone group. In operation, when the microcontroller 276 executes one of the application modules 284, the screen driver 274 generates control signals to drive the screen 272 to display an application-specific user interface accordingly, more of which is described below.

The controller 270 includes a network interface 280, referred to as an RF interface 280, that facilitates wireless communication with a zone player via a corresponding RF interface thereof. In one embodiment, commands such as volume control and audio playback synchronization are sent via the RF interfaces. In another embodiment, a saved zone group configuration is transmitted between a zone player and a controller via the RF interfaces. The controller 270 may control one or more zone players, such as 102, 104 and 106 of FIG. 1. Nevertheless, there may be more than one controller, each preferably in a zone (e.g., a room) and configured to control any one and all of the zone players.

In one embodiment, a user creates a zone group including at least two zone players from the controller 240, which sends signals or data to one of the zone players. As all the zone players are coupled on a network, the received signals in one zone player can cause the other zone players in the group to be synchronized so that all the zone players in the group play back an identical audio source or a list of identical audio sources in a synchronized manner. Similarly, when a user increases the audio volume of the group from the controller, the signals or data for increasing the audio volume of the group are sent to one of the zone players and cause the other zone players in the group to increase their volumes together.
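
A minimal sketch of the control flow described above follows; the class and function names are assumptions made for illustration (they do not appear in the disclosure), and a shared in-memory list stands in for the network messages exchanged between players.

    # Illustrative sketch: a command issued for the group is applied to every
    # member, so playback and volume changes stay in step across the group.
    # Class, function and source names below are assumptions for this example.

    class ZonePlayer:
        def __init__(self, name, volume=20):
            self.name = name
            self.volume = volume
            self.now_playing = None
            self.group = [self]          # every player starts in its own group

    def link(players):
        """Link players into one zone group (a shared membership list)."""
        group = list(players)
        for player in group:
            player.group = group
        return group

    def play(group, source):
        """Cause every member of the group to render the same source together."""
        for member in group:
            member.now_playing = source

    def change_group_volume(group, delta):
        """Raise or lower every member's volume by the same amount."""
        for member in group:
            member.volume = max(0, min(100, member.volume + delta))

    if __name__ == "__main__":
        bedroom, den = ZonePlayer("Bedroom", 25), ZonePlayer("Den", 35)
        group = link([bedroom, den])
        play(group, "news_radio_stream")     # one command reaches both players
        change_group_volume(group, 5)        # both volumes move together
        assert den.now_playing == "news_radio_stream"
        assert (bedroom.volume, den.volume) == (30, 40)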

According to one implementation, an application module is loaded in the memory 282 for zone group management. When a predetermined key (e.g., the "zones" button 244) is activated on the controller 240, the application module is executed in the microcontroller 276. The input interface 278, coupled to and controlled by the microcontroller 276, receives inputs from a user. A "Zone Menu" is then displayed on the screen 272. The user may start grouping zone players into a zone group by activating a "Link Zones" or "Add Zone" soft button, or de-grouping a zone group by activating an "Unlink Zones" or "Drop Zone" button. The details of zone group manipulation are further discussed below.

As described above, the input interface 278 includes a number of function buttons as well as a screen graphical user interface. It should be pointed out that the controller 240 in FIG. 2B is not the only controlling device that may practice the present invention. Other devices that provide equivalent control functions (e.g., a computing device, a hand-held device) may also be configured to practice the present invention. In the above description, unless otherwise specifically described, keys or buttons refer generally to either physical buttons or soft buttons, enabling a user to enter a command or data.

One mechanism for "joining" zone players together for music playback is to link a number of zone players together to form a group. To link a number of zone players together, a user may manually link each zone player or room one after the other. For example, consider a multi-zone system that includes the following zones:

Bathroom

Bedroom

Den

Dining Room

Family Room

Foyer

If the user wishes to link 5 of the 6 zone players using the current mechanism, he/she must start with a single zone and then manually link each zone to that zone. This mechanism can sometimes be quite time-consuming. According to one embodiment, a set of zones can be dynamically linked together using one command. Using what is referred to herein as a theme or a zone scene, zones can be configured in a particular scene (e.g., morning, afternoon, or garden), where a predefined zone grouping and setting of attributes for the grouping are automatically effectuated.

For instance, a "Morning" zone scene/configuration command would link the Bedroom, Den and Dining Room together in one action. Without this single command, the user would need to manually and individually link each zone. FIG. 3A provides an illustration of one zone scene, where the left column shows the starting zone grouping--all zones are separate, the column on the right shows the effects of grouping the zones to make a group of 3 zones named after "Morning".

Expanding this idea further, a Zone Scene can be set to create multiple sets of linked zones. For example, a scene may create 3 separate groups of zones: the downstairs zones would be linked together, the upstairs zones would be linked together in their own group, and the outside zones (in this case the patio) would move into a group of their own.

In one embodiment as shown in FIG. 3B, a user defines multiple groups to be gathered at the same time. For example, an "Evening Scene" is desired to link the following zones:

Group 1

Bedroom

Den

Dining Room

Group 2

Garage

Garden

where Bathroom, Family Room and Foyer should be separated from any group if they were part of a group before the Zone Scene was invoked.

One of the important features, benefits and objects of the present invention is that zones do not need to be separated before a zone scene is invoked. In one embodiment, a command is provided that, when invoked, links all of the appropriate zones in one step. The command is in the form of a zone scene. After linking the appropriate zones, a zone scene command could apply the following attributes (illustrated in the sketch after this list):

Set volume levels in each zone (each zone can have a different volume).

Mute/Unmute zones.

Select and play specific music in the zones.

Set the play mode of the music (Shuffle, Repeat, Shuffle-repeat).

Set the music playback equalization of each zone (e.g., bass, treble).
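
The following sketch illustrates one way such a zone scene might be represented and invoked in a single step; the data structure, field names and the invoke function are assumptions made for illustration and do not come from the disclosure.

    # Illustrative sketch of a zone scene: a named, predefined grouping of
    # zones plus the attributes to apply when the scene is invoked.
    # All names, fields and values below are assumptions for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class ZoneScene:
        name: str
        zones: list                      # zones to link into one group
        volumes: dict = field(default_factory=dict)   # per-zone volume levels
        muted: bool = False
        music: str = ""                  # what to play when invoked
        play_mode: str = "normal"        # e.g. "shuffle", "repeat"

    def invoke(scene, system_state):
        """Link the scene's zones in one step and apply its attributes."""
        system_state["groups"][scene.name] = list(scene.zones)
        for zone in scene.zones:
            zone_state = system_state["zones"][zone]
            zone_state["volume"] = scene.volumes.get(zone, zone_state["volume"])
            zone_state["muted"] = scene.muted
            zone_state["now_playing"] = scene.music
            zone_state["play_mode"] = scene.play_mode

    if __name__ == "__main__":
        state = {"groups": {},
                 "zones": {z: {"volume": 20, "muted": False,
                               "now_playing": None, "play_mode": "normal"}
                           for z in ["Bedroom", "Den", "Dining Room"]}}
        morning = ZoneScene("Morning", ["Bedroom", "Den", "Dining Room"],
                            volumes={"Bedroom": 15, "Den": 25}, music="News")
        invoke(morning, state)           # one command links and configures all
        assert state["groups"]["Morning"] == ["Bedroom", "Den", "Dining Room"]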

A further extension of this embodiment is to trigger a zone scene command as an alarm clock function. For instance, the zone scene is set to apply at 8:00 am. It could link the appropriate zones automatically, set specific music to play, and then stop the music after a defined duration. Although a single zone may be assigned to an alarm, a scene set as an alarm clock provides a synchronized alarm, allowing any zones linked in the scene to play predefined audio (e.g., a favorite song, a predefined playlist) at a specific time or for a specific duration. If, for any reason, the scheduled music fails to be played (e.g., an empty playlist, no connection to a share, failed UPnP, no Internet connection for an Internet Radio station), a backup buzzer will sound. This buzzer is a sound file that is stored in a zone player.
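
A minimal sketch of this alarm behavior, including the fallback to a locally stored buzzer sound when the scheduled music cannot be played, is shown below; the function names, the play callbacks and the failure model are assumptions made for this example.

    # Illustrative sketch of a scene used as an alarm: at the scheduled time,
    # try to start the scheduled music in the scene's zones; if that fails for
    # any reason, fall back to a buzzer sound stored on each zone player.
    # Names and the failure handling below are assumptions, not from the patent.

    def run_alarm(scene_zones, playlist, play, play_local_buzzer,
                  duration_minutes=30):
        """Attempt the scheduled playback, sounding a backup buzzer on failure."""
        try:
            if not playlist:
                raise RuntimeError("empty playlist")
            for zone in scene_zones:
                play(zone, playlist[0])          # may raise on network errors
        except Exception:
            for zone in scene_zones:
                play_local_buzzer(zone)          # sound file stored on the player
        return duration_minutes                  # caller stops playback afterwards

    if __name__ == "__main__":
        started = []
        run_alarm(["Bedroom", "Den"], [],        # empty playlist forces fallback
                  play=lambda zone, item: started.append((zone, item)),
                  play_local_buzzer=lambda zone: started.append((zone, "buzzer")))
        assert started == [("Bedroom", "buzzer"), ("Den", "buzzer")]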

FIG. 3C shows an exemplary user interface (UI) 330 showing all available individual zones in a house. Each zone player can play a type of media (such as music, photographs and video) independently. Each zone player in the UI may be highlighted on the screen using either a touch screen or an input device such as a stylus, a scroll wheel, or arrow buttons. If a user wishes to link players in some rooms together to form a group, so that players in these rooms play the same media in a synchronized fashion, the user may activate the grouping function by activating "link zones" 332, which leads to a user interface 340 as shown in FIG. 3D.

The UI 340 shows that the zone players available for grouping are selectable. In one embodiment, the UI 340 is displayed (e.g., on a touch screen) to allow the user to choose which zone players are to be included in a group named "Bedroom" so that they are all playing the same song, "The Beatles". It should be noted that the user may have an option to name the scene, for example, "afternoon" or "light music". In the example shown in FIGS. 3C and 3D, the user selects the Bedroom zone and then the "Link Zones" button 332; as a result, the scene being created is given the default name "Bedroom". As shown in FIG. 3D, a zone player may be selected or highlighted by "checking" it into the group. In another embodiment, the selection action could also be achieved by pressing the "+-" icon next to each zone.

FIG. 3E shows a user interface 350 after the user has selected some of the available zone players into the scene. The display 350 is arranged so that a user can easily tell a group of linked players apart from the isolated players. According to one embodiment, a display may be provided to visually indicate to a user which players have been grouped and which have not. The display may even show the various groups by size to indicate the number of zone players in each of the groups; for example, the larger a group appears, the more zone players there are in the group.

In general, all players in a group are caused to play the media being played in the first member used to form the group. In the case of FIG. 3E, the zone player in the bedroom is used to initiate the group, i.e., it is the first one in the group. At the time the group is formed, the zone player in the bedroom is playing "The Beatles"; as soon as a second zone player joins the group, the second zone player is synchronized with the one already in the group and thus, in this case, also plays "The Beatles". As will be described below, the user can now switch the group of players to any other type of media or a different piece of music, and all of the zone players in the group will play the selected media at the same time.
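
A minimal sketch of this "follow the first member" behavior is shown below; the ZoneGroup class and the play_synced/current_track calls are assumptions used for illustration only.

    class ZoneGroup:
        def __init__(self, head):
            self.head = head            # the first member used to form the group
            self.members = [head]

        def add(self, player):
            # A newly joined player immediately synchronizes to whatever the head is playing.
            self.members.append(player)
            player.play_synced(self.head.current_track(), clock=self.head)

        def change_media(self, track):
            # Switching media on the group changes it for every member at the same time.
            for player in self.members:
                player.play_synced(track, clock=self.head)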

FIG. 4 shows an exemplary user interface 400 that may be displayed on a controller 142 or a computer 110 of FIG. 1. The interface 400 shows a list of items that may be set up by a user to cause a scene to function at a specific time. In the embodiment shown in FIG. 4, the list of items includes "Alarm", "Time", "Zone", "Music", "Frequency" and "Alarm length". "Alarm" can be set on or off. When "Alarm" is set on, "Time" is the specific time at which to set off the alarm. "Zone" shows which zone players are set to play the specified audio at that time. "Music" shows what is to be played when the specified time arrives. "Frequency" allows the user to define a frequency of the alarm. "Alarm length" defines how long the audio is to be played. It should be noted that the user interface 400 is provided herein to show some of the functions associated with setting up an alarm. Depending on an exact implementation, other functions, such as time zone, daylight savings, time synchronization, and time/date format for display, may also be provided without departing from the present invention.
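
A hedged sketch of the settings behind the items listed in the interface 400 follows; the field names mirror the on-screen labels, but the data structure itself and the example values are assumptions.

    from dataclasses import dataclass

    @dataclass
    class AlarmSettings:
        enabled: bool          # "Alarm" on/off
        time: str              # "Time" at which the alarm goes off, e.g., "08:00"
        zone: str              # "Zone": which zone players (or scene) to activate
        music: str             # "Music" to be played when the time arrives
        frequency: str         # "Frequency" of the alarm, e.g., "once", "daily", "weekdays"
        alarm_length_min: int  # "Alarm length": how long the audio is played

    morning_alarm = AlarmSettings(True, "08:00", "Morning", "Favorites playlist", "weekdays", 30)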

According to one embodiment, each zone player in a scene may be set up for different alarms. For example, a "Morning" scene includes three zone players, one each in a bedroom, a den, and a dining room. After selecting the scene, the user may set up an alarm for the scene as a whole. As a result, each of the zone players will be activated at the specified time.

FIG. 5A shows a user interface 500 to allow a user to form a scene. The panel on the left shows the available zones in a household. The panel on the right shows the zones that have been selected and are to be grouped as part of this scene. Depending on an exact implementation of a user interface, Add/Remove buttons may be provided to move zones between the panels, or zones may be dragged between the panels.

FIG. 5B shows another user interface 520 to allow a user to form a scene. The user interface 520, which may be displayed on a controller or a computing device, lists the available zones in a system. A checkbox is provided next to each of the zones so that a user may check the zones to be associated with the scene.

FIG. 5C shows a user interface 510 to allow a user to adjust a volume level of the zone players in a zone scene individually or collectively. As shown in the user interface 510, the "Volumes . . ." control (shown as sliders; other forms are possible) allows the user to affect the volumes of the associated zone players when a zone scene is invoked. In one embodiment, the zone players can be set to retain whatever volume they currently have when the scene is invoked. Additionally, the user can decide whether the volumes should be unmuted or muted when the scene is invoked.

FIG. 6 shows a flowchart or process 600 of providing a player theme or a zone scene for a plurality of players, where one or more of the players are placed in a zone. The process 600 is presented in accordance with one embodiment of the present invention and may be implemented in a module to be located in the memory 282 of FIG. 2C.

The process 600 is initiated only when a user decides to proceed with a zone scene at 602. The process 600 then moves to 604, where it allows the user to decide which zone players are to be associated with the scene. For example, there are ten players in a household, and the scene is named after "Morning". The user may be given an interface to select four of the ten players to be associated with the scene. At 606, the scene is saved. The scene may be saved in any one of the members in the scene. In the example of FIG. 1, the scene is saved in one of the zone players and displayed on the controller 142. In operation, a set of data pertaining to the scene includes a plurality of parameters. In one embodiment, the parameters include, but are not limited to, identifiers (e.g., IP addresses) of the associated players and a playlist. The parameters may also include volume/tone settings for the associated players in the scene. The user may go back to 602 to configure another scene if desired.
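
A minimal sketch of the kind of scene record that step 606 could save is shown below; the JSON layout, addresses and file name are assumptions used only to illustrate the parameters the text lists.

    import json

    scene_record = {
        "name": "Morning",
        "players": ["192.168.1.11", "192.168.1.12", "192.168.1.13", "192.168.1.14"],  # identifiers, e.g., IP addresses
        "playlist": ["Track A", "Track B"],
        "volumes": {"192.168.1.11": 25, "192.168.1.12": 40},                           # optional volume/tone settings
    }

    def save_scene(record, path="scenes.json"):
        # Persist the scene so a member of the scene (or a controller) can reload it later.
        with open(path, "w") as f:
            json.dump(record, f, indent=2)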

Given a saved scene, a user may activate the scene at any time or set up a timer to activate it, at 610. The process 600 continues when a saved scene is activated at 610. At 612, upon the activation of a saved scene, the process 600 checks the status of the players associated with the scene. Checking the status of the players means verifying that each of the players is in condition to react in a synchronized manner. In one embodiment, the interconnections of the players are checked to make sure that the players can communicate among themselves and/or with a controller, if there is such a controller in the scene.

It is assumed that all players associated with the scene are in good condition. At 614, commands are executed with the parameters (e.g., pertaining to a playlist and volumes). In one embodiment, data including the parameters is transported from a member (e.g., a controller) to other members in the scene so that the players are caused to synchronize an operation configured in the scene. The operation may cause all players to play back a song in identical or different volumes or to play back a pre-stored file.

One of the features, benefits and advantages of the present invention is to allow sets of related devices (controllers and operating components) to exist as a group without interfering with other components that are potentially visible on the same wired or wireless network. Each of the sets is configured according to a theme or a scene.

FIGS. 7A and 7B illustrate a sequence of screens in accordance with one embodiment of the present invention for manipulating a plurality of zone players in an exemplary four-zone distributed audio system. There are four zone players in four zones, referred to as "Zone 1", "Zone 2", "Zone 3" and "Zone 4".

FIG. 7A shows a grouping process. A first "Zone Menu" 702 shows a first list 703 of available zone players. One of the zone players or an existing zone group is selected as a zone group head 704, which is indicated with uniformly highlighted text. It is noted that the highlighted text may also be expressed as grouped icons, concatenated text or other representations of a current selection on the screen. The highlighted text reacts to a user's scrolling selection (e.g., via a scroll wheel 250 of FIG. 2B). Also shown as one of the bottom labels on the first "Zone Menu" 702 is the "Link Zones" or "Add Zone" label 706 that corresponds to a soft button (e.g., soft button 266 of FIG. 2B). A scroll indicator is displayed beside the first display 702 when there are too many items to be accommodated in one display.

When the soft button corresponding to "Link Zones" or "Add Zone" 706 is activated, a second "Zone Menu" 708 is displayed. A second list shows eligible zone groups or zone players 709 for the zone group head 704. Since "Zone 2" has been selected as the zone group head to form a zone group, the eligible zone groups and zone players are now "Zone 1", "Zone 3" and "Zone 4". As shown as highlighted texts, the zone player (Zone 4) 710 is selected to be grouped with the zone group head (Zone 2) 704 to form a new zone group.

After the user confirms the selection, the newly formed zone group configuration is updated and the audio source can be played synchronously on all of the zone players in the newly formed zone group as shown in FIG. 7A. The first "Zone Menu" 712 is displayed again with the newly formed zone group 714 as one of the choices, along with other available zone players. Depending on implementation, any zone player that has been used in one group may or may not be usable in another group. As shown in FIG. 7A, the zone players 2 and 4 are in a zone group started by the zone player 2; the zone player 4 (or even the zone player 2) can also be in another zone group. For example, while a player in a living room is grouped with a player in a dining room, the same player in the living room can also be grouped with a player in a family room.

In another embodiment, a display shows a list of available zone players for grouping. An interactive graphic interface allows a user to interactively select some of the available zone players, which are then automatically grouped. Any one of the selected zone players in the group may be elected to be a group head such that other players in the group are synchronized to follow the group head.

According to one embodiment, the synchronization of all zone players in the new zone group is achieved with the following steps: 1) choosing an audio source from one of the zone players in the group, 2) checking whether the chosen audio source is available locally on each of the zone players, 3) if the audio source is not available locally, retrieving the audio source via the data network from another device (e.g., another zone player) that has the audio source, and 4) playing the audio source on each of the zone players synchronously. In another embodiment, the audio source of the group or of the player being added is chosen as the default for the other zone players. As shown in FIG. 7A, the audio source for "Zone 4" is track 10 by artist D. Accordingly, the zone player "Zone 2" will play track 10 by artist D synchronously with "Zone 4" after the new zone group is formed.
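
The four steps can be expressed as the following sketch; the player methods (current_audio_source, has_local, fetch_over_network, play_synced) are hypothetical placeholders rather than the actual zone player protocol.

    def synchronize_group(group, source_player):
        # 1) Choose the audio source from one of the zone players in the group.
        source = source_player.current_audio_source()
        for player in group:
            # 2) Check whether the chosen source is already available locally.
            if not player.has_local(source):
                # 3) Otherwise retrieve it over the data network from a device that has it.
                player.fetch_over_network(source, from_device=source_player)
        # 4) Play the audio source on each of the zone players synchronously.
        for player in group:
            player.play_synced(source, clock=source_player)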

FIG. 7B shows a de-grouping process in reference to FIG. 7A. A first "Zone Menu" 722 shows a first list 723 of available zone players and zone groups, if any. One of the zone groups ("Zone 2+Zone 4") to be de-grouped is selected, as shown with the highlighted text 724. When the "Unlink Zones" or "Drop Zone" soft button 726 is activated, the second "Zone Menu" 728 displays a list 729 that shows all of the member zone players in the selected zone group. One of the zone players 730 (Zone 4) in the selected zone group is chosen to be disassociated from the zone group 724. When the de-grouping is confirmed by the user, the "Zone Menu" 732 is presented to reflect the de-grouping of the zone group (e.g., "Zone 2") 734. When zone players are grouped together, all of the zone players play the same audio source synchronously. If a zone player is disassociated or dropped from the zone group, the remaining zone players in the zone group continue playing the audio source. The "Zone Menu" 732 shows that the zone player (Zone 4) has no music 736 after the disassociation from the zone group, while the remaining player (Zone 2) continues to play the same music, track 10 by artist D.

Referring now to FIG. 7C, there is shown a sequence of screens depicting alternative steps of creating a zone group. An exemplary five-zone audio system is used to describe these alternative steps. There are five zone players located in a living room, a dining room, a kitchen, an office and a master bedroom. It is assumed that three of the five rooms, the living room, the dining room and the kitchen, are grouped to form a zone group called "LivingRoom+DiningRoom+Kitchen". The screen display 750 shows that an audio source called "Counting Crows" 752 is being played on all the zone players in the group. When a user activates the "music" button 248 on a controller 240 of FIG. 2B, the screen display 760 shows a "Music Menu" page with a list of choices 762. One of the choices is "Play Music From Other Rooms" 764. When the user selects this option, the screen display 770 for "Play Same Music As Other Rooms" displays a list 772 of eligible rooms or zone players to be grouped with the current group "Living Room+Dining Room+Kitchen". In this example the eligible rooms are Office 774 and Master Bedroom 776. It is assumed that the "Office" 774, indicated with the highlighted bar, is chosen. As a result, the zone player "Office" is grouped with the original zone group to become a new group called "LivingRoom+DiningRoom+Kitchen+Office" as shown in screen display 780, and the audio source "Miles Davis" 782 from the zone player "Office" is played synchronously on all zone players in the new group.

Referring now to FIGS. 8A and 8B, there is shown a sequence of screens in accordance with one embodiment of the present invention for controlling audio volume of zone players in a zone group. These screens are activated and displayed when one of the volume control buttons is activated. According to one embodiment, the volume control buttons are "mute" button 262 and "volume up/down" button 264 on the controller 240 of FIG. 2B.

FIG. 8A shows that the current active zone player is in a living room of a house. The "Volume" panel 810 is displayed for the current zone player "Living Room". A volume meter 812 is included to indicate a volume adjustment made by a user. A mute icon 814 is shown when the "mute" button is activated while the audio is on.

FIG. 8B shows that the current active zone group includes five rooms: living room, dining room, kitchen, den and study. A "Volume" panel 830 for the zone group is displayed for the convenience of a user. In the display, a plurality of volume meters 831 is shown. One of the volume meters is for the entire zone group 832. The other volume meters are for the zone players in the group, one for each room or zone player. A scroll indicator shown as a downward arrow 836 indicates that the screen is too small to hold all of the volume meters in one screen display; more hidden choices can be viewed by scrolling down. The item at the scroll cursor is highlighted (e.g., Den 834). As a user scrolls down the list of volume meters 831, the contents of the "Volume" panel 840 include the volume meter of the next zone player on the list (e.g., Study 844). Similarly, when the scroll indicator is an upward arrow 842, other hidden choices within the list can be viewed by scrolling up.

When a user adjusts the audio volume, only the highlighted zones or zone players are affected. If the highlighted selection is a single zone player, the adjustment applies only to that particular player. If multiple zone players are selected, the adjustment applies to all of the chosen players, similar to the adjustment via the group volume meter described below. If the highlighted selection is the volume meter of the entire group 832, the audio volume of all zone players in the zone group will be affected. Any audio volume adjustment to the zone group applies equally to all of the zone players within the entire zone group. Depending on implementation, the relative difference of the audio volume among the zone players in the group remains unchanged, either in percentage or in graphic strength.
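
A hedged sketch of these selection rules follows, with the token "GROUP" standing in for a highlighted group volume meter; the representation of the highlighted selection is an assumption for illustration only.

    def affected_players(highlighted, group_members):
        # Return the players a volume button press should change, per the rules above.
        if highlighted == "GROUP":          # the group volume meter 832 is highlighted
            return list(group_members)      # every zone player in the group is adjusted equally
        if isinstance(highlighted, str):    # a single zone player is highlighted
            return [highlighted]
        return list(highlighted)            # several highlighted players are adjusted together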

FIG. 9 shows a flowchart or process 900 of implementing one embodiment of the present invention for manipulating zone players. The process 900, which is preferably understood in conjunction with the previous figures, especially FIGS. 2B, 2C, 7A and 7B, may be implemented in software, hardware, or a combination of both. According to one embodiment, an application module implementing the process 900 is embedded in a controller, for example, the device 240 of FIG. 2C. The module may be loaded into the memory 282 to be executed by the microcontroller (processor) 276 and operates in conjunction with user input commands received via the input interface 278.

The process 900 starts with a display at 902 showing a list of zone players or existing zone groups, if there are any. When the list of available zone players and zone groups is too long to be presented on the display, a scroll indicator is displayed beside the list. A user may access the hidden items within the list by scrolling either upward or downward. At 903, the process 900 splits into two branches based on the following tasks: 1) grouping a plurality of zone players to form a zone group, or 2) de-grouping a zone group.

If the process 900 performs the grouping task, the process moves on to 904. The user selects one of the zone players as a zone group head, or one of the zone groups, from the list. Once the selection is made and a key is activated (e.g., a soft button 706 as shown in FIG. 7A), a new list (e.g., screen 708 as shown in FIG. 7A) showing all of the zone players and zone groups that are eligible to be grouped with the selection made at 904 is displayed at 908. In one embodiment as shown in FIG. 7A, when the zone player "Zone 2" is selected as the zone group head in a four-zone audio system, the eligible zone players are all other zone players ("Zone 1", "Zone 3" and "Zone 4") except for the zone player "Zone 2". At 910, the end user is then given the option to select one or more of the eligible zone players (e.g., "Zone 4" in FIG. 7A) or one of the eligible zone groups to be grouped with the selection made at 904.

At 912, the user has the option to confirm and accept, or to discard, the selection made at 910. When a confirmation is made, the process 900 creates the zone group by synchronizing all of the zone players in the zone group at 914. In one embodiment, the synchronization is performed by first determining the audio source of the selected player to be grouped (e.g., Zone 4 in FIG. 7A). Then the audio source is transmitted to all other zone players in the same zone group before the audio source is played synchronously. In the meantime, the newly formed zone group configuration is updated. For example, the zone group configuration is saved to the memory 206 on the zone player 200 of FIG. 2A via wireless communication. After the zone players in the newly formed zone group have been synchronized, the process 900 moves back to 902. An updated list of available zone players and zone groups is displayed. As an example shown in "Zone Menu" 712 of FIG. 7A, the newly formed zone group is listed as one of the items.

Going back to 912, if the selection made at 910 cannot be confirmed or is to be discarded, the process 900 goes back to 902 without updating any zone group configuration. In this case, the original list is intact (e.g., "Zone Menu" 702 of FIG. 7A).

Going back to the task test at 903, when the process 900 performs the de-grouping task, the process 900 moves to 924. A user selects one zone group from the first list. Once the selection is made, a list (e.g., screen 728 as shown in FIG. 7B) showing all of the zone players in the selected zone group is displayed at 928. In one embodiment as shown in FIG. 7B, where "Zone 2+Zone 4" is the selected zone group, the zone players in the selected zone group are "Zone 2" and "Zone 4". At 930, the end user selects one or more of the zone players (e.g., Zone 4) from the list to be disassociated from the zone group. At 932, the user has the option to confirm or to discard the selection made at 930.

When the selection is confirmed, the process 900 updates the selected zone group by disassociating the zone player from the zone group at 934. As a result of the disassociation, the audio source being played in the zone group is no longer available to the disassociated zone player. In the meantime, the updated zone group configuration is saved. For example, the zone configuration is saved to the memory 206 on the zone player 200 (FIG. 2A). After the zone player has been disassociated, the process 900 moves back to 902. The list of available zone players and zone groups is displayed. This time the newly updated zone group is listed as one of the items (e.g., "Zone Menu" 732 of FIG. 7B).
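
A minimal sketch of the de-grouping update at 934 is shown below; the dictionary layout and the save_config callback are assumptions used only to illustrate the step.

    def degroup(zone_group, player_name, save_config):
        # Disassociate the player; the group's audio source is no longer available to it.
        if player_name in zone_group["members"]:
            zone_group["members"].remove(player_name)
        # Persist the updated configuration, e.g., to a zone player's memory.
        save_config(zone_group)
        return zone_group   # the remaining members keep playing the audio source

    evening = {"name": "Zone 2 + Zone 4", "members": ["Zone 2", "Zone 4"]}
    degroup(evening, "Zone 4", save_config=lambda cfg: None)
    # evening is now {"name": "Zone 2 + Zone 4", "members": ["Zone 2"]}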

Going back to 932, if the selection made at 930 is to be discarded, the process 900 moves back to 902 directly without updating the zone group configuration. In this case, the original list is presented (e.g., "Zone Menu" 722 of FIG. 7B).

Referring now to FIG. 10, there is shown a flowchart or process 1000 of controlling the audio volume of a plurality of zone players in a zone group. The process 1000, which is preferably understood in conjunction with the previous figures, especially FIGS. 2B, 2C, 8A and 8B, may be implemented in software, hardware, or a combination of both. According to one embodiment, an application module implementing the process 1000 is embedded in a controller, for example, the device 240 of FIG. 2C. The module may be loaded into the memory 282 to be executed by the microcontroller (processor) 276 and operates in conjunction with user input commands. In one embodiment, the module is configured to control the audio volume of a group of zone players. It should be noted that "zone group" and "a group of zone players" are used interchangeably in the description of FIG. 10.

At 1005, the process 1000 starts when one of the volume control buttons, for example, "mute" button 262 or "volume up/down" button 264, on the controller 240 of FIG. 2B, is activated. The process 1000 splits into two branches depending on whether a single zone player or a group of zone players is to be controlled at 1010.

If it is for a single zone player, a volume meter (e.g., "Volume" panel 810 in FIG. 8A) is presented at 1012. At 1014, the end user has the option to adjust the volume for the zone player with one of the volume control buttons. The volume control signals are sent from the controller 240 (FIG. 2C) to the zone player 200 (FIG. 2A). In one embodiment, the volume panel displays a moving volume meter showing an increasing or decreasing bar as the audio volume of the selected zone player is adjusted up or down. In another embodiment, the mute icon is shown instead of a volume meter when the "mute" button is activated while the audio is on.

At 1016, the process 1000 waits for a user's command. If a predetermined amount of time (e.g., 1 second) has elapsed, the process 1000 ends, which means the user is not going to change the volume. Otherwise, the process 1000 goes back to 1012 to wait for another action from the end user.

Referring back to 1010, if the process 1000 is for a group of zone players, then the process 1000 moves to the zone group branch at 1022, in which a plurality of volume meters is presented. The plurality of volume meters includes one for each of the zone players in the zone group, plus one more for the entire zone group. In one embodiment as shown in FIG. 8B, a "Volume" panel 830 for a plurality of zone players in a zone group is presented. When the screen is too small to display all the zone players, a scroll indicator is displayed beside the list of the volume meters to indicate that there are hidden volume meters. At 1024, the user selects one of the volume meters. At 1026, the audio volume is adjusted with one of the volume control buttons. When the volume adjustment is made to one of the zone players, only the selected zone player is affected; the audio volume of the rest of the zone players remains unchanged. However, when the volume adjustment is made to the zone group, the entire group responds to the volume adjustment on an identical scale. In one embodiment, the identical scale is based on a percentage of the audio volume. In another embodiment, the identical scale is based on the graphic representation of the volume meter.

In one embodiment, an end user increases the audio volume for the zone group by 5%. The volume for each of the zone players in the zone group will then be increased by 5%, and the relative loudness difference among the zone players remains unchanged. In another embodiment, if a user has muted one of the zone players of the zone group, the volume of all other zone players remains unchanged.

The group audio volume is calculated based on a predetermined formula. In one embodiment, the group audio volume is the average value of the audio volumes of all the zone players within the zone group. In another embodiment, the median value may be chosen as the group audio volume. The user uses a scrolling device (e.g., scroll wheel 250 of FIG. 2B) to select a zone player or the entire zone group and then adjusts the volume with one of the volume control buttons.
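
A minimal sketch combining these two ideas follows: a group meter derived from the member volumes (average or median) and a proportional group adjustment that preserves the relative differences among the zone players. The volume model and the clamping to 0-100 are assumptions.

    from statistics import mean, median

    def group_volume(volumes, method="mean"):
        # The group meter shows an aggregate of the member volumes.
        values = list(volumes.values())
        return mean(values) if method == "mean" else median(values)

    def adjust_group(volumes, percent):
        # A group adjustment scales every member by the same percentage, so the
        # relative differences among the zone players remain unchanged.
        return {name: max(0.0, min(100.0, v * (1 + percent / 100.0)))
                for name, v in volumes.items()}

    rooms = {"Living Room": 40, "Dining Room": 30, "Kitchen": 20, "Den": 50, "Study": 35}
    print(group_volume(rooms))      # -> 35 (average of the member volumes)
    print(adjust_group(rooms, 5))   # each member raised by 5%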

At 1028, the process 1000 waits for a user's command. If a predetermined amount of time has elapsed, the process 1000 ends, which means the user is not going to change the volume of the zone group. Otherwise, the process 1000 goes back to 1022 to wait for another action from the end user.

FIGS. 11A-11D are a series of screenshots according to one embodiment of the present invention on a computing device on a network. The computing device may correspond to the device 110 of FIG. 1 and be configured to control operations of the zone players installed in a complex. Because the screen of the computing device is larger than that of a portable controller, the graphic user interface on the larger screen appears different from that shown, for example, in FIGS. 7A-7B; the underlying principle nevertheless does not depart from the above description for the portable controller. A user is able to control any one or all of the zone players from the computing device. FIG. 11A shows all individual available zone players, with the one in the Dining Room selected to play a track entitled "A Charlie Brown Christmas". FIG. 11B shows a pop-up window listing the remaining available players to be grouped with the one in the Dining Room. Depending on the desired group, the user can select from the remaining available players to be grouped with the one in the Dining Room to form a zone group. FIG. 11C shows that, as a result, the players in the Dining Room and the Living Room are in one group and synchronously play a song entitled "Christmas time is here". FIG. 11D shows the control of some exemplary acoustic characteristics of the zone players in a group, with the volume being controlled.

The present invention can be implemented in many ways, each of which may yield one or more of the following benefits, advantages or features. One of them is a mechanism provided to enable a user to remotely control audio characteristics of the zone players either as a group or as an individual player. Second, an interactive graphic user interface is provided to enable a user to manage, create, delete or modify zone groups. Another one of the benefits, advantages or features is to provide a user interface to facilitate a user to control audio characteristics of an individual zone player or a group of zone players. Other benefits, advantages or features can be appreciated by those skilled in the art given the detailed description herein.

The present invention has been described in sufficient detail with a certain degree of particularity. It is understood by those skilled in the art that the present disclosure of embodiments has been made by way of example only and that numerous changes in the arrangement and combination of parts may be resorted to without departing from the spirit and scope of the invention as claimed. While the embodiments discussed herein may appear to include some limitations as to the presentation of the information units, in terms of format and arrangement, the invention has applicability well beyond such embodiments, as can be appreciated by those skilled in the art. Accordingly, the scope of the present invention is defined by the appended claims rather than by the foregoing description of embodiments.

* * * * *
