cloudnine
Jul 14, 04:08 PM
To charge $1,800 for a system that only has 512MB is a real disappointment. 1GB of RAM oughta be standard, especially with Leopard on the horizon.
Unless the Xeon is that expensive (which I can't see how it would be), I don't see that as anything except creating some separation between the configurations.
I agree... my buddy got a MacBook Pro and it came standard with 512MB of RAM. For the first 3 or 4 days, he thought he'd purchased a defective notebook, it ran so badly. Opening MS Office applications literally took minutes, and that was with nothing else open. He took it back to the Apple Store and the rep told him that his problem was his RAM, so he purchased another 1GB (1.5GB total), and now it runs perfectly. You'd think that with all of these Intel machines being released, and a huge selection of software not yet Universal, 1GB of RAM would be standard...
kinda a$$h0lish if you ask me. :mad:
NY Guitarist
Apr 5, 08:47 PM
Everything else you said is all well and good, but why on earth would anyone need to download a 4K movie?
4K is coming sooner rather than later. YouTube already has 4K media; of course it looks bad because of the YT compression penalty.
4K displays are coming too, both computer monitors and home theater.
Virtualball
Apr 19, 02:32 PM
It appears from the F700's standpoint though the natural progression became TouchWiz.
Wrong. Just because a company released one phone with a similar look to the iPhone doesn't mean their current offerings are a progression of that phone. It's a true testament to who browses this forum if you honestly think that. The F700 didn't run an advanced OS, so it probably ran Symbian or used BREW. That means all Samsung did was create a theme. How does a theme they made 3 years prior to the Galaxy S mean it's a progression of the coding and UI they built? It doesn't. Here's a list of every Samsung phone: http://en.wikipedia.org/wiki/Category:Samsung_mobile_phones Now, pick out one of those and say it inspired all of their new devices 3 years later.
The F700 was an iPhone clone with a keyboard. It's depressing that people are saying that the iPhone copied its own clone.
-SD-
Aug 17, 02:44 PM
According to Sony's Gamescom press conference, GT5 is coming to Europe on Wednesday 3rd November (http://www.joystiq.com/2010/08/17/gran-turismo-5-arriving-in-europe-on-november-3/), the day after its US release.
:apple:
logandzwon
Apr 19, 02:51 PM
The First Commercial GUI
http://img62.imageshack.us/img62/5659/star1vg.gif
Xerox's Star workstation was the first commercial implementation of the graphical user interface. The Star was introduced in 1981 and was the inspiration for the Mac and all the other GUIs that followed.
http://img217.imageshack.us/img217/7892/leopardpreviewdesktop4.jpg
http://img714.imageshack.us/img714/5733/xerox8010star.gif
-The Star was not a commercial product. Xerox didn't sell them. (Well, eventually they did, but not as PCs; they were closer to what we'd call a terminal today.)
-The middle image is actually of an Apple Lisa. I think you were just showing it as a comparison, but some people might think you're saying it's a Star. It's not. It's a Lisa.
-Apple compensated Xerox for the ideas borrowed from the Star. SJ and the Mac team were already working on their GUI before any of them ever saw the Star, though. Also, Macintosh 1 wasn't a copy of the Star; in fact, a lot of the staples of a modern GUI were innovated by Apple for the Macintosh.
Bill McEnaney
Apr 29, 10:03 AM
Labelling birthers as racist, paranoid, or nutters is just pandering to the distraction of all this. The persistence of this "issue" could be more of a deliberate desire to belittle or erode the political power of the current president, which is akin to schoolyard gossiping, sure, but it's still strangely effective.
The name-calling is childish, too. In fact, you usually hear it from left-wingers.
wizz0bang
Jul 20, 09:57 AM
Bring on the multi-core GPUs! :)
whatever
Sep 13, 12:41 PM
All the people who just coughed up $3K for a quad-core Mac Pro.
I'm one of those people who dropped $4K for a quad-core Mac Pro, and basically I'm happy that I did. It blows away everything else out there today and will be the top-performing Mac until 2007. Apple will not be releasing an upgrade to the Mac Pro this year, no matter what anyone says.
Why, you might ask? Well, they don't need to!
But what if the competition releases these super-fast machines? Won't Apple be left behind? No! What OS will those machines be running? Windows XP. One of the things that separates Apple from everyone else is their OS. They have an OS which takes full advantage (the important word is full) of the hardware. It's the big advantage they have over Dell and HP: they create the software that runs on the computer.
So if I want to run Final Cut Pro as fast as possible on an optimized machine, then I'll have to run it on a Mac. Alright, that's a bad example, but in a way it's not, because a lot of the people buying Mac Pros also live in Apple's pro apps.
The next new computer we'll see from Apple anytime soon will be the MacBook Pro, which will be redesigned (featuring the MacBook's keyboard); upgrades to the MacBook won't happen until January (however, Apple may try to get them out in December).
Apple's goal is to have everything 64-Bit before Leopard is uncaged.
ethana
Mar 22, 12:53 PM
BlackBerry PlayBook = the iPad 2 killer - you heard it here first.
Look at the specs; they're greater than or equal to the iPad 2's, with the exception of battery life.
Uhhh... screen size?
aricher
Sep 13, 09:31 AM
Are these processors 32 or 64 bit? I told one of my PC-lovin' IT guys about the 8 core Mac this morning and he said, "32 bit processors are ancient technology no matter how many you stuff into a box, but I guess they are OK for entertainment computers." :rolleyes:
Joshuarocks
Apr 8, 12:17 AM
Retail sucks dookey... and Best Buy or Worst Buy can go out of business for all I care.
Silentwave
Aug 26, 10:42 PM
I agree with you wholeheartedly.
But, I guess they COULD have put a Pentium D in them... didn't they have dual cores?
Yes, but they were significantly hotter, consumed much more power, and worst of all were incredibly inefficient per clock versus C2D. If memory serves, when the Conroe/Allendale (the codename for C2D desktop chips under 2.4GHz with 2MB L2) benchmarks first came out after the NDA lifted, the best Pentium Extreme Edition (3.73GHz Pentium D Presler core, dual-core, 2x2MB L2, 1066MHz FSB, 130W TDP) was in many of the tests at least equaled by the Core 2 Duo E6300, a chip with the following specs:
Speed: 1.86 GHz Dual core
2MB L2 Cache
1066 MT/s FSB
TDP 65W
So a much slower, far cheaper C2D chip matches the best Pentium D Extreme Edition, even though both are dual-core and have the same FSB speed, the Pentium D has a bigger L2 cache, and each of its cores is clocked at twice the speed of the Core 2 chip.
The C2D chips, with the sole exception of the Core 2 Extreme X6800, have a TDP of 65W, HALF that of the Pentium D series. Even the X6800 only has an 80W TDP.
To give you an idea of pricing, the *retail* version of the Core 2 Duo 1.86GHz chip at Newegg is listed at $193.
The retail version of the Pentium Extreme Edition dual core 3.73GHz chip at Newegg is listed at $1,015.
The rest of the Pentium D line has been dropped in price significantly since Core 2 Duo came out; it's almost a fire sale. Then again, they are much hotter, less efficient processors by far.
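To put those figures side by side, here's a quick back-of-the-envelope sketch in Python. It only uses the clocks, TDPs, and Newegg list prices quoted above, so treat it as an illustration, not a benchmark:

```python
# Back-of-the-envelope comparison using only the figures quoted above.
# If the two chips roughly trade blows in benchmarks, these ratios show
# what the C2D achieves per clock, per watt, and per dollar.

chips = {
    "Pentium EE (3.73GHz Presler)": {"clock_ghz": 3.73, "tdp_w": 130, "price_usd": 1015},
    "Core 2 Duo E6300":             {"clock_ghz": 1.86, "tdp_w": 65,  "price_usd": 193},
}

pee = chips["Pentium EE (3.73GHz Presler)"]
c2d = chips["Core 2 Duo E6300"]

print(f"Clock ratio (PEE/C2D): {pee['clock_ghz'] / c2d['clock_ghz']:.2f}x")  # ~2.01x
print(f"TDP ratio   (PEE/C2D): {pee['tdp_w'] / c2d['tdp_w']:.2f}x")          # 2.00x
print(f"Price ratio (PEE/C2D): {pee['price_usd'] / c2d['price_usd']:.2f}x")  # ~5.26x
```

In other words: roughly equal performance at half the clock and half the power, for under a fifth of the price.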
MacinDoc
Aug 27, 03:23 AM
The iMac will get a Conroe. Nothing can be as dumb as putting a laptop chip in the desktop iMac. If the iMac could hold a G5 in it, it sure can hold a Conroe chip.
So, does a dual core Conroe produce less heat than a G5? Remember, the iMac is essentially a laptop form factor, so heat dissipation is more difficult. I agree, though, it will get a Conroe chip, as long as it is cool enough.
Squire
Jul 15, 06:10 AM
For what it's worth, Alienware's top-of-the-line ALX series desktops (actually, all of their desktops, I believe) have the power supply at the top, too. I know some will scoff but they are lauded for their gaming performance and they brag about their cooling technology.
-Squire
Consultant
Mar 25, 10:44 PM
So is there real resolution independence, or just a 2x mode?
jclardy
Mar 26, 11:14 AM
This. Until this happens, displays won't advance any further for actual computers (non-tablet) because there are so many form factors.
Apple can spend the time to make graphics for each flavor of iPhone or iPad because there aren't that many to deal with. It becomes a lot more difficult to do this across a large range of products. Besides, computers are getting to the point where they are too powerful for most users (hence the popularity of the iPad). A retina display option would give people more incentive to upgrade their desktops, laptops, etc. I think?
As a designer, I'd love a retina 27" ACD. 300dpi right on my screen, almost perfect. Now if we could just get the color/brightness a little more accurate...
I really don't see the point of a display anywhere near 300DPI for a desktop or laptop. My MBP 15" with the 1680x1050 display has a DPI of 128, and with this I can only see the pixels of the fonts if my face is 6" away from the screen, which is above the keyboard. If you have a monitor on a desk it is going to be at least a foot away, but probably more like 1.5-2 feet.
Some of Apple's displays are still around 90-100 DPI, and I could see upgrading from those to around 150 or so. The main reason they aren't doing it right now is that the menu bar and all other interface elements would be tiny. On my MBP they are already pretty small, along with all the default fonts, and that is only at 128 DPI.
So some kind of resolution independence is necessary, I am hoping for a general fix and not just a retina display fix (2x) because there will be no in between. With a general fix they could implement a slider that allows you to resize everything to fit any resolution.
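For reference, pixel density falls straight out of resolution and diagonal size. Here's the arithmetic as a small Python sketch; the 15.4" panel is the one mentioned above, while the 27" examples (including the pixel-doubled one, which is hypothetical) are my own illustrations:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution in pixels over diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The 15.4" 1680x1050 MBP panel mentioned above comes out to ~129 PPI.
print(f'15.4" 1680x1050: {ppi(1680, 1050, 15.4):.0f} PPI')

# A 27" 2560x1440 display works out to ~109 PPI.
print(f'27" 2560x1440:   {ppi(2560, 1440, 27):.0f} PPI')

# A hypothetical pixel-doubled 27" panel (5120x2880) would be ~218 PPI,
# still short of print's 300 DPI, which is why returns diminish at
# desktop viewing distances.
print(f'27" 5120x2880:   {ppi(5120, 2880, 27):.0f} PPI')
```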
But back on topic, I am pretty surprised if this is true. I guess they are pushing for a summer release, but I guess they could be pretty much feature complete by now and just need to work out bugs.
shawnce
Aug 6, 11:33 AM
Mac OS X Leopard
Introducing Vista 2.0
http://www.flickr.com/photo_zoom.gne?id=207241438&size=l
I bet we're gonna get some good t-shirts this year like we did back when Tiger was announced ("Introducing Longhorn").
Digital Skunk
Apr 7, 07:27 AM
Everything depends on your work and needs, right? For me... I'm short-format and tweak every frame.
In terms of full disclosure, I own the FCP 4 suite and CS5 master suite, and own all the major Apple products (hardware and software). I also run Windows 7 in Boot Camp.
Short-format work is all about After Effects. Motion is 5 years behind and offers an incomplete feature set in comparison. After Effects marries up well with the tools from big 3D players, like Maxon's C4D. It's a great pipeline.
I'll watch next week's announcements with interest, but the release of an "iMovie Pro" won't interest me... and it seems like that's where Apple is headed. They are now fixated on Consumers Lite and Consumers Plus.
Apple is also doing everything to push me away from its platform, with its anti-Flash crusade and its complete inability to support ANY (I mean ANY of the top 5-7) professional GPUs.
For the serious pro, Apple is living on borrowed time, and the Steve Jobs reality-distortion field is weakening. Redmond is calling, and increasingly, serious content professionals are listening. I never imagined these words coming from my mouth, but it's the truth.
Coming from a full-time multimedia/journalism/photography/etc. professional, I have to totally and completely
AGREE!
I've seen a huge decline in Apple's interest in the professional market, and I don't even mean the high-end pro; we're talking SMB and SOHO type stuff here. The last revision of FCP was just not worth it unless you were buying new or buying to make sure you didn't have any leftover bugs.
Avid Media Composer and Premiere have gained massive leads on FCP in terms of workflow and speed. Once the younger college students start seeing how fast they can deliver a product with Adobe or Avid, they'll start wondering why the small houses switched to FCP in the first place, and start wanting to learn what the industry is working with... Avid, After Effects, Pro Tools, etc. And the "iMovie Pro" will be left to indie filmmakers and consumers with deep pockets.
** disclaimer ** I have nothing against the indie segment... I am in it and love it. But Apple makes it harder with every update to justify staying with a company that has too much on its plate and not enough staff to keep up with the rest of the market.
Apple will always claim that "no one's buying it" rather than, "we didn't make it marketable and desirable" when they go to axe some hardware or software title.
patrick0brien
Sep 13, 01:37 PM
I smell it an option for Rev. B.
As Mac Daily News says: "Mac Pro Octo-Core. For when you absolutely, positively have to sequence the entire human genome before lunch."
Naaaaaaaaaaaaaaarrrrrrrrrrrrrrr!
0815
Apr 27, 08:50 AM
Maybe this will stop the large daily 1 AM data chunks being sent on 3G??? My most active time on 3G data always happens when I'm asleep... :eek:
Sleep walking a lot lately?
dsnort
Mar 31, 09:03 PM
The very fact that the Gingerbread source is available has given my Orange UK branded ZTE Blade Gingerbread before other phones had official builds.
Could you re-write the sentence so that it has a subject and a predicate?
bibbz
Jun 9, 11:41 PM
You mean Wal-Mart or something else? I've never heard of Wally World. Is that a chain back east? :confused:
Walmart, lol
4God
Jul 14, 02:32 PM
If true, these definitely would be powerful machines, however for people like myself, the power and resulting price tag will be simply too much to justify. Leave the Xeons for the PowerMacs, but introduce some mini-tower machines with Conroe chips - they would fit nicely between the iMac and PowerMac. For me, the Mac mini isn't enough, the iMac is great, however non-upgradeable. I'd like something upgradeable, where I could replace/upgrade HDDs, optical drives, and most importantly the display - yet a PowerMac is overkill for my needs. It sure would be nice to see, but I doubt Apple will do it... :cool:
Well said, I agree with you. Apple, IMHO, needs an "in-between" machine for upgradability. This would shorten the gap between consumer and prosumer.
rwilliams
Mar 22, 01:13 PM
This is just a preview of the future: Android-based tablets will clean the iPad's clock. Apple made the so-called iPad 2 as a 1.5: low-res camera, not enough RAM, and a low-res screen. It's going to be a verrrry long 2012 for Apple. Sure, it's selling like hot cakes now, but when buyers see tablets that they don't have to stand in line for, that have better equipment and are cheaper... Apple's house of cards will come crashing down around them.
The only strength that Apple has is the app ecosystem, which is why they are going after Amazon for spitting on the sidewalk. They know the world of hurt coming their way.
Well, you knew it was only a matter of time before this cat showed up.