{{short description|GPU by Matrox}}
[[Image:Matrox parhelia 128mb agp.jpg|thumb|250px|Parhelia AGP 128 MB]]
[[File:Matrox Parhelia 256MB PCI.jpg|thumb|250px|Parhelia PCI-X 256 MB]]

The '''Matrox Parhelia-512''' is a [[graphics processing unit]] (GPU) released by [[Matrox]] in 2002. It has full support for [[DirectX]] 8.1 and incorporates several DirectX 9.0 features. At the time of its release, it was best known for its ability to drive three monitors ("Surround Gaming") and its ''Coral Reef'' tech demo.

As had happened with previous Matrox products, the Parhelia was released just before competing companies released cards that completely outperformed it. In this case it was the [[Radeon R300|ATI Radeon 9700]], released only a few months later. The Parhelia remained a niche product, and was Matrox's last major effort to sell into the consumer market.


==Background==
The Parhelia series was Matrox's attempt to return to the market after a long hiatus, its first significant effort since the [[Matrox G200|G200]] and [[Matrox G400|G400]] lines had become uncompetitive. Its other post-''G400'' products, the G450 and G550, were cost-reduced revisions of ''G400'' technology and were not competitive with [[Radeon|ATI's Radeon]] or [[GeForce|NVIDIA's GeForce]] lines in [[3D computer graphics]].


==Description==
===Features===
The Parhelia-512 was the first GPU by Matrox to be equipped with a 256-bit memory bus, giving it a memory-bandwidth advantage over other cards of the time. The "-512" suffix refers to the 512-bit ring bus. The Parhelia processor featured glyph acceleration, in which anti-aliasing of text was accelerated by the hardware.
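As a rough illustration, the sketch below computes the card's theoretical peak memory bandwidth from the bus width, using the retail 275&nbsp;MHz DDR memory clock given under Video cards; the figures are simple arithmetic, not a benchmark:

<syntaxhighlight lang="python">
# Theoretical peak memory bandwidth of the retail Parhelia-512.
BUS_WIDTH_BITS = 256       # memory bus width
MEMORY_CLOCK_MHZ = 275     # retail memory clock
TRANSFERS_PER_CLOCK = 2    # DDR SDRAM transfers data twice per clock

bytes_per_transfer = BUS_WIDTH_BITS // 8  # 32 bytes per transfer
bandwidth_gb_s = bytes_per_transfer * MEMORY_CLOCK_MHZ * TRANSFERS_PER_CLOCK / 1000
print(f"{bandwidth_gb_s:.1f} GB/s")       # 17.6 GB/s
</syntaxhighlight>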


Parhelia-512 includes four 32×4 [[vertex shader]]s with a dedicated [[displacement mapping]] engine, and a pixel shader array with four texturing units and a 5-stage pixel shader per pixel pipeline. It supports 16× fragment [[Spatial anti-aliasing|anti-aliasing]]; all of these features were shown prominently in Matrox's ''Coral Reef'' technical demo.
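The appeal of fragment anti-aliasing is that only edge pixels carry extra samples. A minimal sketch of the storage arithmetic follows; the 5% edge-pixel share is an assumed figure for illustration, not a Matrox specification:

<syntaxhighlight lang="python">
# Storage comparison: 16x supersampling vs 16x fragment anti-aliasing.
WIDTH, HEIGHT, SAMPLES = 1024, 768, 16
EDGE_FRACTION = 0.05  # assumed share of pixels on polygon edges

pixels = WIDTH * HEIGHT
supersampled = pixels * SAMPLES                 # every pixel stores 16 samples
fragment_aa = (pixels * (1 - EDGE_FRACTION)     # interior pixels: 1 sample
               + pixels * EDGE_FRACTION * SAMPLES)  # edge pixels: 16 samples

print(f"supersampling: {supersampled / 1e6:.1f}M samples")  # 12.6M
print(f"fragment AA:   {fragment_aa / 1e6:.2f}M samples")   # 1.38M
</syntaxhighlight>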


The display controller supports a 10-bit color frame buffer (called "Gigacolor") with 10-bit 400&nbsp;MHz RAMDACs on two RGB ports and a 230&nbsp;MHz RAMDAC on the TV encoder port, which was an improvement over its competitors. The frame buffer uses an RGBA (10:10:10:2) format and supports full gamma correction. Dual-link TMDS is supported via an external controller connected to the digital interface.
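A minimal sketch of how one pixel of such a 10:10:10:2 format fits into a 32-bit word; the field ordering here is illustrative, not Matrox's documented hardware layout:

<syntaxhighlight lang="python">
def pack_rgba_1010102(r: int, g: int, b: int, a: int) -> int:
    """Pack 10-bit R, G, B and 2-bit alpha into one 32-bit word.

    Ranges: r, g, b in [0, 1023]; a in [0, 3].
    R-in-low-bits ordering is an assumption for illustration.
    """
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return r | (g << 10) | (b << 20) | (a << 30)

def unpack_rgba_1010102(word: int) -> tuple[int, int, int, int]:
    """Recover the four channels from a packed 32-bit word."""
    return (word & 0x3FF, (word >> 10) & 0x3FF,
            (word >> 20) & 0x3FF, (word >> 30) & 0x3)
</syntaxhighlight>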


The memory controller supports 256-bit DDR SDRAM.
The "Surround Gaming" support allowed the card to drive three monitors, creating a unique level of gaming immersion. For example, in a flight simulator or sim racing, the middle monitor could show the windshield while the left and right monitors could display the side views (offering peripheral vision). However, only two displays can be controlled independently.
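The geometry behind the widened view is straightforward: at the same view distance, three side-by-side screens triple the half-width of the viewing frustum. A sketch, where the 90° single-screen field of view is an assumed example value:

<syntaxhighlight lang="python">
import math

def surround_fov(single_fov_deg: float, monitors: int = 3) -> float:
    """Horizontal FOV when `monitors` identical screens share one viewpoint.

    The screens triple the half-width of the view plane, so the
    half-angle's tangent is multiplied by the monitor count.
    """
    half = math.radians(single_fov_deg) / 2
    return math.degrees(2 * math.atan(monitors * math.tan(half)))

print(f"{surround_fov(90):.1f} deg")  # one 90-degree screen -> ~143.1 deg across three
</syntaxhighlight>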


===Video cards===
The cards, released in 2002 and simply called Matrox Parhelia, initially came with 128 or 256&nbsp;MiB of memory. Retail cards are clocked at 220&nbsp;MHz core and 275&nbsp;MHz memory; OEM cards are clocked at 200&nbsp;MHz core and 250&nbsp;MHz memory.<ref>[http://www.anandtech.com/showdoc.aspx?i=1645 Matrox's Parhelia - A Performance Paradox]</ref>
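These clocks set the card's theoretical fillrate; a back-of-the-envelope sketch for the retail model, combining the core clock with the four-pipeline, four-texture-unit arrangement from the Features section:

<syntaxhighlight lang="python">
# Theoretical fillrate of the retail Parhelia.
CORE_CLOCK_MHZ = 220
PIXEL_PIPELINES = 4
TEXTURE_UNITS_PER_PIPE = 4

pixel_fillrate = CORE_CLOCK_MHZ * PIXEL_PIPELINES         # Mpixels/s
texel_fillrate = pixel_fillrate * TEXTURE_UNITS_PER_PIPE  # Mtexels/s
print(f"{pixel_fillrate} Mpixels/s, {texel_fillrate} Mtexels/s")  # 880, 3520
</syntaxhighlight>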


To further improve analog image quality, 5th-order [[low-pass filter]]s are used.
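For illustration only, such a filter can be modeled in software; the sketch below assumes a Butterworth response and an arbitrary 350&nbsp;MHz cutoff, neither of which is specified by Matrox:

<syntaxhighlight lang="python">
import math
from scipy import signal

CUTOFF_HZ = 350e6  # assumed cutoff frequency, for illustration only

# 5th-order analog Butterworth low-pass filter.
b, a = signal.butter(5, 2 * math.pi * CUTOFF_HZ, btype="low", analog=True)

# Evaluate the response at the cutoff and one octave above it.
for f_hz in (350e6, 700e6):
    _, h = signal.freqs(b, a, worN=[2 * math.pi * f_hz])
    print(f"{f_hz / 1e6:.0f} MHz: {20 * math.log10(abs(h[0])):.1f} dB")
# A 5th-order filter rolls off at roughly 30 dB per octave past the cutoff.
</syntaxhighlight>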
===Performance===
[[File:Matrox Parhelia512-2.jpg|250px|thumb|Parhelia chip]]
For a top-of-the-line, and rather expensive card ($399 [[USD]]), the Matrox Parhelia's 3D gaming performance was well behind [[NVIDIA]]'s older and similarly priced [[GeForce 4|GeForce 4 Ti 4600]]. The Parhelia was only competitive with the older [[Radeon 8500]] and [[GeForce 3]], which typically cost half as much. The Parhelia's performance potential was held back by its comparatively low GPU clock speed (220&nbsp;MHz for the retail model, 200&nbsp;MHz for the OEM and 256&nbsp;MB models), which many commentators initially attributed to its large (for the time) transistor count. However, ATI's [[Radeon 9700]] was released later that year with a considerably larger transistor count (108 million vs. 80 million) on the same 150&nbsp;nm [[chip fabrication]] process, yet managed a substantially higher clock (325&nbsp;MHz vs. 250&nbsp;MHz).
The card's [[fillrate]] performance was formidable{{citation needed|date=December 2023}} in games that used many texture layers; though equipped with just 4 pixel pipelines, each had 4 texture units. This proved not to be an efficient arrangement in most situations. Parhelia was also hampered by poor bandwidth-conserving techniques: ATI introduced its third-generation [[HyperZ]] in the Radeon 9700 and NVIDIA touted [[Lightning Memory Architecture]] 2 for the GeForce 4 series, while Matrox had no similarly comprehensive optimization approach. Although the Parhelia possessed impressive{{Fact or opinion|date=December 2023}} raw memory bandwidth, much of it was wasted on invisible housekeeping tasks because the card lacked the ability to predict overdraw or compress z-buffer data, among other inefficiencies. Some writers believed Parhelia to have a "crippled" triangle-setup engine that starved the rest of the chip in typical 3D rendering tasks.[https://web.archive.org/web/20030904210006/http://www.anandtech.com/video/showdoc.html?i=1656&p=6]
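A rough model of how overdraw consumes raw bandwidth when hidden fragments cannot be rejected early; all figures below are illustrative assumptions, not measurements:

<syntaxhighlight lang="python">
# Frame-buffer traffic lost to overdraw without early rejection or
# z-compression (illustrative numbers only).
WIDTH, HEIGHT, FPS = 1280, 1024, 60
DEPTH_COMPLEXITY = 3             # assumed average overdraw factor
BYTES_PER_FRAGMENT = 4 + 8 + 16  # color write + z read/write + 4 texture fetches

total = WIDTH * HEIGHT * DEPTH_COMPLEXITY * BYTES_PER_FRAGMENT * FPS
wasted = total * (DEPTH_COMPLEXITY - 1) / DEPTH_COMPLEXITY  # hidden fragments
print(f"total {total / 1e9:.1f} GB/s, wasted {wasted / 1e9:.1f} GB/s")
# total 6.6 GB/s, wasted 4.4 GB/s -- out of the ~17.6 GB/s computed above
</syntaxhighlight>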


Later in Parhelia's life, when DirectX 9 applications were becoming quite prevalent, Matrox acknowledged that the vertex shaders were not Shader Model 2.0 capable, and as such not DirectX 9-compliant, as was initially advertised. Presumably there were several bugs within the Parhelia core that could not be worked around in the driver.{{speculation inline|date=December 2023}} However, it was all a bit of a moot point because Parhelia's performance was not adequate to drive most DirectX 9-supporting titles well even without more complex shader code weighing the card down.


===Sales===
Despite the lackluster performance for its price, Matrox hoped to win over enthusiasts with the Parhelia's unique and high-quality features, such as "Surround Gaming", glyph acceleration, high resolutions, and 16× fragment anti-aliasing. In these aspects, some reviewers{{Like whom?|date=December 2023}} suggested that Parhelia could have been a compelling alternative to the comparably priced GeForce 4 Ti 4600 ($399 [[USD]]), which was the performance leader but only DirectX 8.1 compliant.


However, within a few months of release, the Parhelia was completely overshadowed by [[ATI Technologies|ATI]]'s far faster and fully DirectX 9.0 compliant [[Radeon R300|Radeon 9700]]. The Radeon 9700 was faster and produced higher-quality 3D images, while debuting at the same price as the Parhelia ($399 [[USD]]). Priced the same as faster cards, the Parhelia never gained a significant hold in the market. It remained a niche product, while Nvidia and ATI controlled the majority of the discrete graphics chip market.


==Parhelia-LX==
After the launch of Parhelia-512, Matrox released Parhelia-LX, which supports only 128-bit memory and has only 2 pixel pipelines. The first video cards using it included Matrox Millennium P650 and Millennium P750.


==Future products==
Originally, Matrox planned to produce the "Parhelia 2" successor, codenamed "Pitou".<ref>[http://www.forum-3dcenter.org/vbulletin/showthread.php?s=&threadid=39498 Parhelia II Codename "Pitou"]</ref> However, when Parhelia-512 failed to compete in the gaming market, the project was never again mentioned and Matrox left the gaming market altogether by 2003.


Parhelia processors were later upgraded to support AGP 8× and PCI Express.


In 2006, Matrox re-introduced Surround Gaming with its ''TripleHead2Go'', which uses the system's existing GPU to render 3D graphics and splits the resulting image across three screens.<ref>[https://arstechnica.com/news.ars/post/20060302-6303.html Matrox brings triple displays to gaming with TripleHead2Go]</ref><ref>[http://matrox.com/graphics/en/press/releases/2006/cadgis/th2go/ New Matrox TripleHead2Go external upgrade offers support for 3 monitors at a time]</ref> Certified products include [[ATI Technologies|ATI]] and [[NVIDIA]] (and later Intel) processors.
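In principle the splitting step is simple; a sketch of slicing one wide rendered frame into three per-monitor images, where the 1280×1024 monitor resolution is an assumed example:

<syntaxhighlight lang="python">
import numpy as np

def split_for_triplehead(frame: np.ndarray, monitors: int = 3) -> list[np.ndarray]:
    """Slice one wide frame (H x W x C) into equal per-monitor strips."""
    height, width, _ = frame.shape
    strip = width // monitors
    return [frame[:, i * strip:(i + 1) * strip] for i in range(monitors)]

# Example: a 3840x1024 frame becomes three 1280x1024 monitor images.
wide = np.zeros((1024, 3840, 3), dtype=np.uint8)
parts = split_for_triplehead(wide)
print([p.shape for p in parts])  # three (1024, 1280, 3) strips
</syntaxhighlight>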


With the introduction of the Millennium P690 in 2007, the design was die-shrunk to 90&nbsp;nm and gained DDR2 memory support.<ref>[http://matrox.com/graphics/en/press/releases/2007/corpo/p690series/ Matrox Graphics announces Millennium P690 Series fanless graphics cards]</ref> Windows Vista is supported under the XP Driver Model.


In June 2008, Matrox announced the release of M-Series video cards.<ref>[http://matrox.com/graphics/en/press/releases/2008/corpo/mseries/ Introducing Matrox M-Series - graphics cards for stretched desktop applications powered by industry's first true QuadHead GPU]</ref> It offers the advertised single-chip quad-head support. Unlike previous products, it supports Windows Vista Aero acceleration.

In 2014, Matrox announced that the next line of multi-display graphics cards would be based on 28&nbsp;nm AMD GPUs with Graphics Core Next technologies, with DirectX 11.2, OpenGL 4.4 and OpenCL 1.2 compatibility; Shader Model 5.0; PCI Express 3.0; and a 128-bit memory interface.<ref>[http://www.matrox.com/graphics/en/press/releases/2014/graphics_cards/amd/ Matrox Chooses AMD GPU for Next Generation Multi-display Graphics Cards]</ref> The first AMD-based products, the Matrox C420 and C680, were set to be available in Q4 2014.<ref>[http://www.matrox.com/graphics/en/press/releases/2014/graphics_cards/c-series/ Matrox Unveils Quad and Six-Head PCI Express Graphics Cards] {{Dead link|date=December 2023}}</ref>


==References==
{{reflist}}


==External links==
*[http://www.anandtech.com/video/showdoc.html?i=1645 AnandTech: Matrox's Parhelia - A Performance Paradox]
*[https://archive.today/20130204112644/http://www.tomshardware.com/2002/05/14/matrox_parhelia/index.html Tom's Hardware Preview]
*[http://www.tomshardware.com/2002/06/25/attack_out_of_the_blind_spot/index.html Tom's Hardware Review]


{{Matrox Graphics Cards}}
{{Graphics Processing Unit}}


[[Category:Graphics processing units]]
[[Category:Graphics cards]]


[[ko:매트록스 파헬리아]]
[[ja:Parhelia]]
[[zh:幻日]]
