NVIDIA is holding its annual conference, GTC (GPU Technology Conference), in San Jose. As the name implies, the conference is dedicated to graphics accelerators and the technologies built around them. A word of warning: this article is not for those who hunt for the specifications of new hardware and read spec sheets down to the last comma. That simply doesn't interest me - hardware is not eternal, and its moment passes quickly. To my taste, it is more worthwhile to talk about the changes NVIDIA is trying to bring to the world, and about how the company is changing our idea of what video accelerators are and what they can be used for.
One change that comes as no surprise is the transformation of graphics accelerators into supercomputers capable of crunching complex mathematical models. NVIDIA has created a new market built on the capabilities of ordinary graphics accelerators, where "ordinary" refers not to a particular product but to the idea behind it. It is as if the conceptual framework were turned inside out: a mere peripheral suddenly turned out to be anything but unimportant, and with it complex computation became possible. Three-dimensional graphics is notoriously hungry for computing power, and developing specialized hardware in that direction makes it possible to build supercomputers. This should not be surprising: if you follow the market, you know that supercomputers based on NVIDIA boards are used around the world, and one of them is even installed in Russia, at Moscow State University.
I cite this example of how NVIDIA changed the idea of supercomputers specifically to sketch, in broad strokes, what the company is trying to do with its next move. Nothing more, nothing less than to once again change our understanding of computation and of what the average user can get out of three-dimensional graphics - without buying a special accelerator, or while using software on a mobile device such as a phone or tablet, where installing an accelerator is simply impossible.
As an analogy, the beginning of computing and the mainframe era come to mind. In those days computers occupied huge rooms and were accessed from terminals (thin clients). Later, with the arrival of cheap personal computers, we moved to the client-server model, in which the client had its own computing power and could store information. The idea of a "thin" client, consisting only of a screen and keyboard while data is stored on a server, never really caught on: people like to keep their data at hand. It is possible that without this mental block we would have seen cloud technologies a few decades earlier. On the other hand, their appearance was impossible without the development of the relevant technologies - data compression and data transmission infrastructure - which matured only in the last ten years and has only just reached the minimum required level. That the level is minimal is easy to see even today: try restoring an iPad from iCloud, and the process can take up to several days; the lucky ones manage it in a few hours. Yet the backup itself is quite small, and restoring it locally from a laptop or PC takes about ten minutes. The limiting factors are the capacity of the data centers and of the channels. But everything is changing, and in 2011 active development began on new cloud services that promote the same old idea of remote data storage from a different angle.
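The gap complained about above is easy to put into numbers. A back-of-the-envelope sketch - the backup size and link speeds here are my own assumptions, not Apple's figures - shows why a cloud restore takes hours while a local one takes minutes:

```python
def transfer_minutes(size_gb, mbit_per_s):
    """How long moving `size_gb` gigabytes takes over a link of
    `mbit_per_s` megabits per second (ideal throughput, no overhead)."""
    megabits = size_gb * 1024 * 8
    return megabits / mbit_per_s / 60

backup_gb = 5      # assumed size of an iPad backup
cloud_link = 10    # Mbit/s, a typical home connection of the era (assumed)
local_link = 60    # Mbit/s, syncing to a nearby PC (assumed)

print(f"from iCloud: {transfer_minutes(backup_gb, cloud_link):.0f} min")  # -> 68 min
print(f"locally:     {transfer_minutes(backup_gb, local_link):.0f} min")  # -> 11 min
```

Even in this idealized model, the channel alone turns a ten-minute job into an hour-plus one; real overhead and server-side limits stretch it further.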
Let's develop this idea a little further. If you can store your data in the cloud, why can't the computing power of your computer live there too? This is an idea from the mainframe era, and we know that it works. Over the last decade a whole host of desktop virtualization technologies has appeared, including VDI (Virtual Desktop Infrastructure). A very simplified explanation of how it works: all your software resides on a server, and only the picture is streamed to you as you work. There is just one necessary condition - you need network access to receive the data. The required characteristics of the Internet connection are nothing extraordinary; no dedicated lines or anything of the kind are needed. Science fiction books in which this technology has been described many times immediately come to mind. A brave explorer (are there any other kind in such books?) stumbles upon something unknown on a newly discovered planet and transmits the data to the ship, which analyzes it. Sound familiar? In fact we are dealing with the same architecture: there is a "thin" client (our explorer) and a server (the ship), and everything depends solely on the connectivity between them. Let me digress for a moment and note that in the near future there will be demand both for technologies that protect data transmission channels, mainly terrestrial ones, and for technologies that disrupt them (jamming). In military terms, fleets and armies will unite into a kind of virtual network whose key element will be data centers predicting one enemy move or another. Unlike civilian applications, individual units of equipment will be fitted with powerful local computers so that they can operate even when the military networks are generally unavailable.
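The VDI model described above - all state and computation on the server, only input and pictures crossing the wire - can be sketched in a few lines. Everything here (the class names, a string standing in for a rendered frame) is invented for illustration and is not any real VDI product's API:

```python
class RemoteDesktopServer:
    """Runs the application and renders frames on behalf of a client."""

    def __init__(self):
        self.document = []  # all application state lives server-side

    def handle_input(self, event):
        # The client forwards raw input; only the server applies it.
        if event["type"] == "keypress":
            self.document.append(event["key"])

    def render_frame(self):
        # In a real VDI stack this would be an encoded bitmap;
        # here a string stands in for the picture.
        return "SCREEN: " + "".join(self.document)


class ThinClient:
    """Stores nothing itself; only relays input and shows received frames."""

    def __init__(self, server):
        self.server = server
        self.last_frame = ""

    def type_key(self, key):
        self.server.handle_input({"type": "keypress", "key": key})
        self.last_frame = self.server.render_frame()  # the picture comes back


server = RemoteDesktopServer()
client = ThinClient(server)
for ch in "hi":
    client.type_key(ch)
print(client.last_frame)  # -> SCREEN: hi
```

Note that the client object holds no document at all - lose the device and you lose nothing, which is exactly the selling point (and the dependency) of the thin-client model.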
But enough about the near future; let's look at the present day. NVIDIA has once again decided to shake up the conventional wisdom and proposed NVIDIA VGX (GPU VDI), a technology in which workspace virtualization is achieved - you guessed it - with the help of graphics accelerators. NVIDIA graphics boards are installed on the data center side; they use the Kepler architecture, NVIDIA's latest generation today. Multiple users can access these accelerators simultaneously, each receiving the amount of computing power they need. The effectiveness of such a solution is hard to overestimate: de facto, we are freed from putting a high-performance workstation at every workplace by handing all the computation over to the cloud. Another very important point concerns energy efficiency. The data centers of major companies such as Google, Microsoft and Amazon have hit a limit on heat dissipation. As computing power grows, cooling costs rise dramatically, making linear expansion of the number of server racks impossible; efficiency is required instead. I wonder whether anyone has calculated how much heat all the computers and devices on Earth emit into the environment today; I suspect the number is huge. Another barrier to linear expansion is the lack of sufficient energy to power data centers: they consume more than most industries, yet are located close to consumers (that is, outside industrial zones). I remember, for example, how hard it was for Yandex to find a site for its data center in Moscow, since obtaining the necessary amount of power proved extremely difficult. Exactly the same problem faced the Russian office of NVIDIA, which houses a large data center and a software testing center (mostly for games).
When searching for office space, the building's power capacity became the main criterion, and location was pushed into the background. So: NVIDIA's virtual desktop technology is based on a graphics accelerator connected to the server via a PCI Express slot. One board contains 4 GPUs and 16 GB of memory and can serve up to one hundred users (which raises the question of what the minimum configuration is; presumably this assumes light computation). NVIDIA VGX can be combined with the GPU hypervisor software technology, which integrates the hardware into other companies' software products - for example, Citrix XenServer - thereby achieving GPU virtualization. Finally, at the level of service to end users there is User Selectable Machines (USM) technology. With its help a corporate network administrator can give priority to one machine or another, granting it different capabilities. For example, the lite version would provide ordinary graphics and typical business applications, while the intermediate one would raise the quality and add a number of extra functions. Look at the table to see the difference; I think everything becomes clear from it.
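The quoted board figures - 4 GPUs, 16 GB of memory, up to one hundred users - make the "light computation" caveat obvious. Some quick arithmetic of my own, not NVIDIA's:

```python
# Per-user resources on one VGX board at the quoted maximum load.
gpus_per_board = 4
memory_gb = 16
max_users = 100

memory_per_user_mb = memory_gb * 1024 / max_users
users_per_gpu = max_users / gpus_per_board

print(f"{memory_per_user_mb:.0f} MB of GPU memory per user")  # -> 164 MB
print(f"{users_per_gpu:.0f} users sharing each GPU")          # -> 25 users
```

Roughly 164 MB of graphics memory and a twenty-fifth of a GPU per seat is plenty for office applications, but clearly not for a designer with a heavy scene - hence the open question about minimum configurations.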
In effect this means that with this technology artists and designers will be able to work with large files on almost any hardware; the actual computation happens on the server where NVIDIA VGX is installed. This is the main difference between the current technology and what we have seen to date. Here I am being a little sly, since virtual desktop technology has long been known and is used by many companies, as is server-side computing - in training centers, by researchers, and so on. The novelty of NVIDIA's proposal lies only in the fact that the company plays to its strengths, offering for the first time the chance to get great graphics regardless of your actual hardware (there are, of course, certain physical limitations: if, for example, your display has a slow response time, gaming on it won't work).
In fact, the first virtual desktops were designed for good old ordinary PCs. Now, for the first time, it becomes possible to get a comparable level of graphics on mobile devices such as tablets. There is no other way to achieve this, since you cannot fit a powerful graphics card into a tablet - there will always be a gap between the cards in desktop PCs and their mobile versions, which must keep power consumption low. Take a look at a few demonstration videos from NVIDIA; I think they speak for themselves, and if this becomes possible on tablets, we can talk about fundamentally new opportunities for market development. To some extent we are taking another step toward an infrastructure in which the device in your hands becomes less powerful but longer-lived, its main features being a screen and a fat data channel. The technologies we have discussed already exist - these are not demo versions - and commercial deployment will begin in the second half of 2012. As you can see, NVIDIA is trying to occupy every niche associated with graphics, as well as adjacent areas. For example, the presented technology is of interest not only for desktop virtualization but also as part of supercomputers, replacing existing GPUs with more efficient data processing (CUDA technology). And, in my opinion, the slide showing how many researchers used such computations in 2006 and how many use them today is very telling.
A virtual game console on your PC, TV and tablet
Do you know what the main problem of the console gaming industry is? It is the rather slow evolution of console hardware and the need to plan for a long life cycle, combined with selling the console for little money and making the profit on games. When creating a game console, the manufacturer has to build in some headroom, since the model cannot be updated annually. This, perhaps, is what allows the PC industry to resist the invasion of consoles so successfully: on a PC you have the flexibility to install a new video card later. But here the main problem is that most potential buyers do not play on the latest hardware, which reduces the possible sales of the newest titles - or forces developers to spend money creating versions for different hardware, which costs both money and nerves. The bottleneck, as usual, is the graphics.
Now imagine that you have the option of not buying the newest hardware and can still play all sorts of games on your middling PC. Such a service already exists: www.onlive.com. You get to play the latest games on a TV, tablet, phone, or your laptop. A subscription option is available: for $10 a month you can play an unlimited number of times from a set of games. Individual games can use a different payment option, but the mechanics should be clear: you pay not for a copy of the game but for connecting to the OnLive servers. The disadvantage of this first swallow is latency in the connection to remote servers: some games are hard to play, and for professional players the service is not yet suitable at all. However, it has several advantages. First, piracy is simply a non-issue for the owners. The player has no local copy of the game - everything is stored on the server, and there is nothing to steal. Second, you can play from different devices; your purchases are tied only to your identity. And this, in my opinion, is also a huge contrast with the classical model. For example, playing the same game on a console and a PC in parallel is difficult and pointless: as a rule, your progress does not carry over. Here, all such things become possible. NVIDIA did not have to reinvent the wheel and offered the market exactly the same thing, but with a number of characteristic differences relating to graphics. The technology is called GeForce Grid; in essence it is the same Kepler, but used to stream games and process their graphics. The fundamental difference from OnLive or any other cloud service is that the stream is delivered directly: each player receives it straight from the graphics accelerator, which has its own IP address.
The processor is relegated to the background, which can dramatically reduce delays in delivering the image to the target device. We must understand that there are no miracles here: GeForce Grid is an improvement of already known, existing technologies. As usual at NVIDIA, everything is built around graphics. During the demonstration we were shown a three-dimensional game in which you can move quickly and realistically. Clearly this was a demonstration; in real life everything will again run up against the channel - where the servers are located and what your Internet connection is like (a fairly average one will do, and brief disconnections are tolerated). NVIDIA announced that GeForce Grid will be used by a number of companies providing cloud gaming; one of them will be www.gaikai.com, one of the strongest players in this field.
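Why does cutting the CPU out of the capture path matter? Because in cloud gaming every pipeline stage adds to the delay between pressing a button and seeing the result. A sketch with illustrative numbers - mine, not NVIDIA's - makes the budget visible:

```python
# Made-up per-frame latency budget for a cloud gaming round trip,
# in milliseconds. The figures are assumptions for illustration;
# the point is that every stage contributes, so shortening the
# capture/encode step (what GeForce Grid targets) directly cuts
# the total the player feels.
pipeline = {
    "input upload to server": 15,
    "game simulation + render": 16,
    "capture + encode on GPU": 10,   # the stage GeForce Grid shortens
    "network transit to player": 20,
    "decode + display": 10,
}

total_ms = sum(pipeline.values())
print(f"total round trip: {total_ms} ms")  # -> total round trip: 71 ms
```

With numbers like these, the stream is already several frames behind the player's hands, which is why twitch-reliant genres and professional play are the hardest cases for any cloud service.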
A brief summary
It is not so important whether NVIDIA's technologies succeed in every respect or not. Most data centers that provide remote computing power already use NVIDIA GPUs. This market appeared out of nowhere and keeps growing; a huge number of companies working in different areas are hungry for computation. At this GTC the company made it clear that it is also interested in playing in the consumer market, which is logical. Having supplied technology for data centers, NVIDIA took the next step - adapting the technology for corporations (VDI) - and tried to make a corresponding product for end users (remote gaming). In my opinion, regardless of how the market develops, NVIDIA has secured its future. If the concept of local PCs wins out, graphics accelerators will still be needed; if the cloud gaming market explodes, the company has its own offering there too. But the main thing, which seems very important to me, is that the paradigm of the market is changing. Cloud computing is a new name for old technologies, and the only restriction on its spread to users is the fear of entrusting a critical amount of data to the cloud. The only question is whether the various companies can overcome these beliefs or not. I, for example, am not ready to hand my data over to the cloud, except as secure backups or small services (synchronization of calendars, files, and so on). Are you? On the other hand, I would be delighted to consume games from the cloud if the quality is high and I cannot tell the difference between a local copy of the game and the one on the server. I would also be quite happy not to have to buy Adobe Photoshop just to edit two or three files a year, and instead pay for its use when I need it, with the payment tied to the amount of work I did (that might be the processing of a file, or perhaps its size - there can be a great many monetization models here).
The point is that this new/old market is moving toward different reference points, and a universal yardstick for such computations may soon be hours of CPU time rather than anything else. Imagine advertising in which that same Adobe claims its program is better because it spends less CPU time - roughly like the advertising of modern fuel-efficient cars. Of course, these are dreams or fantasies about the future, but we are approaching it very quickly; I would say we are rushing into it without looking around.
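A pay-per-CPU-hour Photoshop, as imagined above, would meter usage roughly like this. The rate and session lengths are invented for the sketch; no vendor has announced such pricing:

```python
RATE_PER_CPU_HOUR = 0.50  # dollars per metered CPU hour (an assumed price)

def bill(jobs_cpu_hours):
    """Sum up metered compute time across jobs and price it."""
    total_hours = sum(jobs_cpu_hours)
    return round(total_hours * RATE_PER_CPU_HOUR, 2)

# Editing "two or three files a year": three short sessions,
# one CPU hour in total.
sessions = [0.5, 0.25, 0.25]  # CPU hours per editing session
print(bill(sessions))          # -> 0.5
```

Fifty cents for a year of occasional editing versus the price of a full license: that is the kind of arithmetic that would make "spends less CPU time" a real selling point.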