hgb said: I see some impacts; any change affects everything in some way. For example, suppose Adobe ran Photoshop in a web environment. That would mean you don't need to install the binaries (and, sure, you can't crack it), and Adobe would have more control over its product :D. If they never deliver a single copy of their binaries, then there is no possibility of running a personal server with a cracked Photoshop; you would need to pay or subscribe to use it. To crack it you would need to crack the server(s) that offer Photoshop, but they can re-patch the holes and lock you out... You would need to pay to use it. You would also need a good connection (you/I would need to pay more).


I think that software for rent will kill the industry more than the GPL or anything else would. I'll give you a few examples.

1) VirtualDrive (http://www.farstone.com). I paid $129 US for 5 licenses of the professional version to use on my various PCs at home. I have a 1-year active subscription during which I can live-update anytime I want. When that year expires, I cannot live-update. This means that if my PC crashes on the 370th day and I need to reinstall Windows and all the other software, I will not be able to live-update without paying again for another "subscription". For virus protection, I have no complaints. But for software that they claim gives you a perpetual license when, under the hood, that's not the case, it makes me lose motivation.

2) Activation required. For MS software I can tolerate it (though I don't like it). But for other software, such as MathCad 11 (http://www.mathsoft.com) and Photoshop CS (http://www.adobe.com), I simply don't upgrade anymore, because I don't like being locked to a single machine or a limited number of configurations. However, other activation-type software such as Libronix (http://www.libronix.com) is great: you activate and get a key that you can back up and restore as many times as you want, but all the ebooks you buy are hardlocked to your activation so you can't share books with 5 million of your internet friends. That's a good model. Rational (http://www.rational.com) also has a "floating" license concept where you connect to a license server (which is better than activation).

The industry seems to be moving towards the activation route. Even 3rd-party ASP.NET components hardlock you to a domain. This means that if I have a website at mydomain.com/.net/.org/.biz, I have to purchase a license for every domain even if it is really the same site or a mirror. That's why I don't purchase any 3rd-party components for the .NET environment; it's too costly. What happened to the good ol' days when licensing was per developer?

3) Rental software. This is where it is all heading. They want you to pay, and while you are paying you can use it, and if you stop paying, you can't. This can't work for too long. I personally pay thousands of dollars a year for software, but only as I have the money, and I don't constantly update everything. I still use 4-year-old software that suits me just fine. But if I were forced to pay for everything per month or annually, I would slice my budget until I used only what I needed (and become an OSS fanatic myself in the process). What if I ended up unemployed and couldn't afford to renew my Windows license or Office? Well, then I couldn't write my resume or do anything. That can't go over well with the consumer crowd for long. Maybe a year or two tops until everyone realized they are just wasting their money, just another bill to add to the heap for each piece of software. That won't work well for those on a low budget.

Bottom line: the industry is giving people a reason to move away from commercial software through its own greed. We used to hear that if they could control piracy it would lower prices. Well, with activation, they mostly can. With subscription-this, subscription-that, they mostly can. But prices aren't lower; now they justify it by saying they can spend more time and money on R&D for new features, yet prices don't decrease. It's a sham, I tell you. I find myself purchasing less and less software as a result of these practices, but I'm not "sharing" either; I just don't use it or recommend it. I do tell people what they are sacrificing if they use it, and many I know won't use it either. It does the vendors more damage, and they bring it on themselves.


Thanks,
_shawn
Posted on 2004-06-18 12:50:06 by _Shawn
Let me just say that professional PhotoShop users don't work with images of, say, 1024x768 in 24 bit. And jpg compression is not acceptable either. They use images many megabytes in size. It is impossible to create proper interaction between a server and a client with that amount of data. Also, I doubt that browsers are built to handle images that large. Do they support more than 24 bit anyway? What about colour calibration etc?
In short, if you think it's possible, it's just because you have no clue.
Posted on 2004-06-18 13:00:27 by Scali

It is impossible to create proper interaction between a server and a client with that amount of data.


What about transferring the tools and not the image itself??

In short, if you think it's possible, it's just because you have no clue.

Do you seriously understand the rules of this board???
Posted on 2004-06-18 13:09:44 by pelaillo

Let me just say that professional PhotoShop users don't work with images of, say, 1024x768 in 24 bit. And jpg compression is not acceptable either. They use images many megabytes in size. It is impossible to create proper interaction between a server and a client with that amount of data. Also, I doubt that browsers are built to handle images that large. Do they support more than 24 bit anyway? What about colour calibration etc?
In short, if you think it's possible, it's just because you have no clue.


A friend of mine used to do professional graphics work back in the late 80's on the original Targa and Number 9 cards using Crystal Graphics. Even in those days of 200 MB drives the images were always on the order of 10 to 15 MB each; they must have grown exponentially since then, and I would not be surprised if 100 MB images were now common (though I have no way of knowing). The thing is that you edit at extremely high resolution and then render down; it gives you much better final quality than editing at the final target resolution. But these types of intensive applications have mostly moved to SG's and the like, and even then he had an Iris to do the big stuff. Of course the separation between PCs and graphics workstations shrinks with each successive generation of PCs, so it is probably very likely that many are already doing it on PC "rendering farms".
Posted on 2004-06-18 13:15:21 by donkey
mmm, a note on that (on _Shawn's points): the other day I was thinking about whether I could deliver an OS that is 'excellent' and that (ideally) never needs an update, and, to get away from the ways of the large companies (ok, only one company), I would sell the official CD(s) at a price of 20 dollars (which is also close to the pirated price of an OS, Linux or Windows). Then, I guess that choosing between the pirated software and the original at that price, I would pick the original, relegating piracy to nearly nothing.


But after that, suppose the OS is perfect and needs no updates. Then I would only earn money for, say, 3 months, during the time it takes the whole world to buy it; after that, nothing. From the point of view of money, you need to keep earning. For example, a profitable business is food, because everyone needs to eat every day, at least 3 times a day. Now, how can I make OS production profitable like food? I want to keep earning money every day until the end (suppose)... Then I will not deliver software of that quality, even if I can, because I would lose the steady profitability of a business like food.

Now, here is what I think: if it is possible to have a centralized binary that can only be used remotely, then what the companies will do is turn themselves into something like the food business.

Remember that they want to be more indispensable. What better way than fast connections, perhaps a slightly lower cost, and centralized binaries, raising the power on the server side and lowering the power on the client side? You would not be able to deliver good software without using their tools, and if they can obtain complete control over distribution, what could be better?


Also, note that this is an 'ideal scenario', not the actual present, and perhaps a little paranoid; just because I think it is possible does not mean it will happen. I don't know, but sometimes I read that there are better ways to write programs, only they require more production time, and the companies don't want that... :D They want fast production that fits what people say: "the software fits my necessities, I need no more and no less". They want to make programs, not quality software; that would also require stronger knowledge on the part of the programmers, which would require more education and investment.


Have a nice day or night.
Posted on 2004-06-18 13:16:32 by rea
Also, I am talking about another type of interpreter; I completely discard the browser for delivering the apps in favour of another kind of interpreter, something like an IRL. Browsers would become the old ones, the ancestors. Also, yes, about the current browser: you can see there are things it can do and things it can't. Note too that I am proposing a different scenario, one that perhaps time will show... or maybe not.


It is impossible to create proper interaction between a server and a client with that amount of data.


I have one solution for this: translate the operation at runtime... mmm... with the protection of the IRL interpreter you would never know the function, even supposing you could trace it or watch it in some way. Also, the operation does not need to be on your side all the time; you only load it when you need it and immediately unload it. But you are still left without the binary. I think that is also what pelaillo is saying.
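
A minimal sketch of that load-use-unload idea; the server is simulated in-process here, and OPERATIONS, fetch_operation and run_once are all made-up names, not anything a real product ships:

```python
# Sketch of the load-on-demand idea: fetch an operation only when it is needed,
# use it once, and drop it again so nothing stays on the client. There is no
# real server here; OPERATIONS and fetch_operation are stand-ins.

OPERATIONS = {
    "invert": lambda value: 255 - value,
    "halve":  lambda value: value // 2,
}

def fetch_operation(name):
    # in a real system this would be a network request for a short-lived,
    # protected piece of code or bytecode
    return OPERATIONS[name]

def run_once(name, data):
    op = fetch_operation(name)         # load on demand
    result = [op(x) for x in data]     # use it
    del op                             # "unload" immediately; nothing kept locally
    return result

print(run_once("invert", [0, 100, 255]))   # [255, 155, 0]
```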
Posted on 2004-06-18 13:29:09 by rea
What about transferring the tools and not the image itself??


We already agreed that downloadable applications are not web applications, at least not in the sense that the article speaks of.
Posted on 2004-06-18 13:40:12 by Scali
But these types of intensive applications have mostly moved to SG's and the like


A friend of mine is a photographer, and she swears by her dual G5 Mac :)
I suppose Mac is the home ground for PhotoShop. I'll ask her what the size and resolution of the images are that she generally works with.
She also told me that she would use slide film rather than regular film, because it has a finer coating of chemicals, or whatever you call it. It gives a more 'high resolution' image, so the pictures are still sharp when blown up to huge advertisements and such. I wonder how big those images are when stored digitally.
Posted on 2004-06-18 13:43:57 by Scali
Not to be pedantic, Scali, but remember that the way we currently develop applications is not very focused on transferring large data, or on algorithms that handle large data in parts while giving the feel of a single operation with no time lost.

When there is investment in this environment and its drawbacks, like the one you point out, are attacked, the ways to do it will appear.


This is not the current environment, so we don't actually have real tools that can demonstrate that type of application; I am only arguing for things that feel possible. Remember that some time ago the compiler was only an idea, and after a while the first ones appeared; everything evolves. Why do you think that transferring large quantities of data easily cannot be done? Also, remember that moving large quantities of data around the world is not really the central problem today; perhaps moving large quantities of data quickly between the hard disk and main memory is the actual problem. But if that type of application can be built, and the research is profitable, the companies will surely try to contribute new ways of doing things.



I stated earlier that the environment matters and affects things, and that you need to know the environment. We can see the problems clearly enough, but we cannot attack them, because we don't have the tools to test them.


Also, what you say about calibration will depend on the hardware you are using, because the data, being digital, will be the same, and the operations on the data will be the same (they are also centralized). The IRL interpreter will depend on the system, and it is the part that interacts with the user for calibration and other hardware-related matters, not the data and the data manipulation. The display is done by the interpreter; the data and operations are separate from those things. I don't see much of a problem with that.


Have a nice day or night.
Also remember that I am not actually speaking of the present world; these are only suppositions.

Also, if you notice, even if that type of application can be built, I am not much in favour of it :o, because of the possible economic impacts and the stability of the software. I think everyone would move to some free tools :D instead of these centralized applications. What do you think?
Posted on 2004-06-18 13:58:53 by rea
hgb, if I understand you correctly, you say that it is possible to redesign browsers so they can run applications that we run standalone today. Of course it is possible. Why bother though?
You might as well rename an OS to 'browser' and you have the same result. And what is there to gain? Why would we even WANT to have everything running in a 'browser'?
Posted on 2004-06-18 14:38:28 by Scali

We already agreed that downloadable applications are not web applications, at least not in the sense that the article speaks of.

Suppose that your client-side app analyzes the image for some operation, for example a color correction. Then it sends that data back to the server to receive instructions, without sending the image itself. Then your server-side app sends your cutting-edge technology color corrector as a series of instructions to be executed by the client on the image.

Just imagine what we will see in the near future of software development... don't be limited by the things you see today; that's the sense I take from the article.
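
A minimal sketch of that round trip; the "server" is simulated in-process, and summarize, server_color_correction and apply_instructions are hypothetical names, not anything Adobe or anyone else actually ships:

```python
# Sketch of the suggestion above: the client sends only a small summary of the
# image, the "server" answers with a list of instructions, and the client
# applies them to the image it already holds. The server runs in-process here
# just to keep the example self-contained.

def summarize(image):
    # per-channel means of an RGB image stored as a list of (r, g, b) tuples
    n = len(image)
    sums = [0, 0, 0]
    for r, g, b in image:
        sums[0] += r
        sums[1] += g
        sums[2] += b
    return [s / n for s in sums]

def server_color_correction(summary, target=128.0):
    # the "cutting-edge" corrector reduced to: shift each channel mean to target
    return [("add", channel, target - mean) for channel, mean in enumerate(summary)]

def apply_instructions(image, instructions):
    corrected = []
    for pixel in image:
        p = list(pixel)
        for op, channel, amount in instructions:
            if op == "add":
                p[channel] = min(255, max(0, p[channel] + amount))
        corrected.append(tuple(p))
    return corrected

image = [(10, 200, 90), (30, 180, 110), (20, 190, 100)]   # stand-in for megabytes of pixels
instructions = server_color_correction(summarize(image))  # only the summary crosses the wire
print(apply_instructions(image, instructions))
```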
Posted on 2004-06-18 14:38:38 by pelaillo
Does anyone smell financial opportunities??
VMware??? Web-based???

Consider how many Windows messages are sent when you move your mouse... well, just update the affected area. Compression will have a great BIG market to play with.

But some serious encryption is going to be needed...
Posted on 2004-06-18 14:41:25 by Black iCE
mm, just one final note about what I wrote before :).



Hehehe, did I scare you with the centralized binary, and the idea that you would never get your hands on the binary distribution? Actually it is not possible, at least for a heavy application. Suppose you have Photoshop or some other application used around the world on 2 million computers (more or less?). Suppose that at moment x, 25% of the total are connected to the server, and of those, 5% at moment x are doing the most extreme computation. That means (check my calculations) that at any moment the server could be hit with computations equal to 25,000 (or 12,500?) computers. I don't know whether the subscriptions could pay for such a server ;).
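
For what it's worth, the arithmetic behind those figures, using the post's own guessed percentages:

```python
# Working through the figures above; the percentages are the post's own guesses.
installed = 2_000_000            # Photoshop-class installs worldwide (guess)
connected = installed * 0.25     # 25% online at any given moment
heavy = connected * 0.05         # 5% of those doing the most extreme computation
print(connected, heavy)          # 500000.0 25000.0 machines' worth of peak load
```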

Also, that was going to the extreme; now let's look at a more local level.

Suppose you buy Photoshop for servers (IRL), implemented for Silicon Graphics (or another good platform), and you buy 50 computers without much computational power but with good display hardware and a good mouse (or whatever you want for drawing... a pen). You save the money of buying 50 Silicon Graphics machines, yet you get the accuracy and power of the Silicon Graphics displayed on any of the other computers.

About the data transfer, on a LAN it should not be much of a problem (perhaps). Taking it that only 25% actually use the server at any moment, and that of that 25% only 10% run heavy computations, the Silicon Graphics machine would carry (if I am not wrong) 0.625 machines' worth of activity at all times. Perhaps I am wrong in the calculation, but if it is correct, that kind of activity does not overload the Silicon Graphics machine much.


Also, a thing to point out here is that the interpreter only displays the interface and sends the commands; it does not actually perform any operation on (or directly to) the data. It only receives the commands to represent the output, which is IRL.
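
A tiny sketch of that display-only interpreter; the names and the message format are invented, and the "server" again runs in-process just to keep the sketch self-contained:

```python
# Sketch of the display-only interpreter described above: the client sends user
# commands, all work on the data happens server-side, and only drawing
# instructions come back.

def server_handle(command):
    # the data and the operations live here; only render output leaves
    if command == "zoom_in":
        return {"primitives": [("blit", 0, 0, 2048, 2048)]}
    return {"primitives": []}

def interpreter(command):
    render = server_handle(command)      # stands in for a network round trip
    for primitive in render["primitives"]:
        print("draw", primitive)         # rasterize as well as the local hardware allows

interpreter("zoom_in")
```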


Seen that way, without being too paranoid, server applications can be good ;), I think. Not much of a problem, perhaps... because you can keep plugging computers into the server, and the corporation loses at least 50 installations ;).


Also, yes, I see in this a return to the beginning, when there were only terminals, but not everything needs to be implemented that way. What do you think: as a designer (or something like that), would you buy one good graphics processor (I don't know if Silicon Graphics is the right one) and let the other machines interpret it as best they can? There are details; for example, will the representation of the commands be close to what a Silicon Graphics machine can display? The operations will be as if the Silicon Graphics machine were doing them (and in fact it is the Silicon Graphics machine doing them), but the representation can be different... (if you want to display on a monochrome screen and there is an interpreter there, it will try to interpret it as best that computer can).


Have a nice day or night.
Posted on 2004-06-18 14:54:40 by rea
Then your server-side app sends your cutting-edge technology color corrector as a series of instructions to be executed by the client on the image.


What advantage could this possibly have over storing that same code locally (and let's not talk about storage)?
I would find it extremely annoying if I had to wait for the proper code to be downloaded for every operation that I want to perform in a piece of software.

By the way, here is some info:

Well, I suppose it depends on what the images are going to be used for, for example greeting cards or posters, but as a rule you would always exceed their requirements anyway. So for example a portrait image would have to be at least 10x8 (25.4x20.3) printable. We scan images into digital media at 48-bit HDR colour at 1200 dpi; this results in a 10x8 (25.4x20.3) image being close on 610 MB. This example would then be saved either as a RAW or TIFF image with no compression. This allows the end user to rescale it to something they want to use, i.e. if they scale it down to 600 dpi it will be enlarged by 100%, etc.
Direct digital images from the camera are transferred using a RAW format called NEF, which is Nikon's own RAW format; these are 12-bit using a colour mode called sRGB. Camera images are quite small, around 9 MB with a resolution of 3008x2000; the camera is 6.1 megapixel.


Clearly, sending 610 MB to and fro is impossible; even on a gigabit LAN it would take quite some time (the HDD may become the bottleneck if the network is fast enough, but you still slow down a lot). Even the 9 MB image would already be reasonably troublesome for most internet users, even if they have broadband.
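
Rough numbers for the sizes quoted above, assuming ideal link throughput (real transfers would be slower):

```python
# Rough transfer times for the sizes quoted above, assuming ideal throughput
# (real transfers are slower once protocol overhead and the disk get involved).
def transfer_seconds(size_mb, link_mbit_per_s):
    return size_mb * 8 / link_mbit_per_s

print(transfer_seconds(610, 1000))  # ~4.9 s per direction on an ideal gigabit LAN
print(transfer_seconds(610, 1))     # ~4880 s (about 81 minutes) on a 1 Mbit/s line
print(transfer_seconds(9, 1))       # ~72 s even for the 9 MB camera image
```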

And the alternative of sending code through the browser to operate on the image makes no sense whatsoever. Why would I want to do that rather than storing the code locally, so I can use it anytime I like, without delays?
Posted on 2004-06-18 16:41:14 by Scali
would you buy one good graphics processor (I don't know if Silicon Graphics is the right one)


Silicon Graphics uses ATi chips these days. Might as well buy a PC :)
Posted on 2004-06-18 16:43:39 by Scali
Yes, the low-end SG's use ATi chipsets; the high-end stuff uses their V12, and I'm not sure what that is based on. The thing about SG is that you are buying a complete bundled graphics system: MIPS processors, huge storage, the IRIX OS dedicated to fast graphics at the expense of general-purpose applications, not to mention the proprietary software. But this argument was beaten to death a number of times and does not need to be resurrected yet another time.

Thanks for the info on the size of graphics files these days; it is interesting how much they have grown since Sid was doing that sort of thing. I used to visit his office in Toronto quite often, and what is commonplace now was "WOW" stuff back then. I always liked the image of the eagle that came with the Targa cards; it was the "ultimate thing" at 32-bit 1024*768 :)
Posted on 2004-06-18 17:00:32 by donkey
No, the high end uses ATi chipsets. Their Onyx monster does.
Basically, SGI has been overtaken by game PCs in the past few years.
As I said before, we replaced some of our SGI visualization servers with regular P4s with regular 'game' cards, simply because they were faster. SGI seems to have realized this and apparently buys its chips from ATi now, just like any PC manufacturer.
Posted on 2004-06-18 17:10:08 by Scali

No, the high end uses ATi chipsets. Their Onyx monster does.
Basically, SGI has been overtaken by game PCs in the past few years.
As I said before, we replaced some of our SGI visualization servers with regular P4s with regular 'game' cards, simply because they were faster. SGI seems to have realized this and apparently buys its chips from ATi now, just like any PC manufacturer.


I would tend to agree with you there, though it is certainly not my area of expertise (not much is :) ). Gaming PCs have made incredible leaps in technology over the years, and in the same way that many dedicated systems were surpassed or equalled in terms of hardware, I do not doubt that SG has been. But as I said, it is an entire system aimed at graphics that you are buying, not just the hardware; all else being equal, SG has the best OS and software to get the job done.

It sort of reminds me of the DN550s (Apollo) that I liquidated once; they were 68000-based workstations and had an OS (Domain-IX) geared to graphics and desktop publishing. They all had some DTP software installed; I can't remember the name, but it was originally meant for VAX machines with a linotronix. The thing was that from a strictly hardware standpoint any Mac could blow them out of the water, but the Apollo would run circles around them: it was dedicated to a single task and everything was designed for that task, from the hardware to the OS to the OS/software interaction (API). It comes back to dedicated platforms being better suited to a task than general-purpose PCs.
Posted on 2004-06-18 17:21:27 by donkey
But as I said, it is an entire system aimed at graphics that you are buying, not just the hardware; all else being equal, SG has the best OS and software to get the job done.


Well yes, one major advantage is that it's not a PC, and not designed as such. So no AGP bus that makes CPU->GPU a one-way street.
Then again, how often do you need that when you have GPUs as advanced as we do today... So while SGI systems may still be better, most of the time you cannot justify their extra cost, because them being better is purely theoretical. As long as the GPU does most of the work, it doesn't really matter what OS, CPU, software, or whatever else is feeding it its data.
And since SGI doesn't have Direct3D, I would say that the OS/software support on PCs may actually be better at this time. ATi GPUs in particular are designed primarily for Direct3D, and the 9700 design was pretty much the blueprint for Direct3D 9.
Posted on 2004-06-18 17:30:01 by Scali
It is your area of expertise, so I will definitely take you at your word; as I said, I am no expert in this subject and I learn as I go along. Nice to have someone to explain it though. :alright:
Posted on 2004-06-18 17:34:42 by donkey