Wednesday, March 18, 2009

Sci Fi Channel and the Death of Science Fiction TV

Bye bye Sci Fi.

If you haven't heard, Sci Fi Channel has announced that they're changing their name to SyFy Channel in an attempt to broaden their audience. What we have to remember, however, is that this is about dollars, not geeks.

From TVWeek.com:
“We spent a lot of time in the ’90s trying to distance the network from science fiction, which is largely why it’s called Sci Fi,” said TV historian Tim Brooks, who helped launch Sci Fi Channel when he worked at USA Network.

Looking past the fact that "sci-fi" and "science fiction" are synonyms in the vernacular (as much as Harlan Ellison hates it), this statement still doesn't quite jibe with reality. Sci Fi Channel started in 1992 under an advisory board that included Star Trek creator Gene Roddenberry, author Isaac Asimov and, if my memory of the comic book ads is correct, actor Leonard Nimoy. A joint venture of Paramount Pictures, USA Networks, and Universal Pictures, the whole point of the channel was to rerun the old science fiction shows and movies the studios had plenty of. It was to be a celebration of science fiction, although I personally didn't know if there was enough applicable programming to support an entire channel. These days you can catch an episode or series online or on DVD, but back then, if you missed a show, it was possible you'd never see it again. So at the time, the chance to see The Twilight Zone, the original Battlestar Galactica and Buck Rogers in the 25th Century was pretty darn cool. Unfortunately, we didn't get the channel back then, and I was relegated to reading about it in Wizard Magazine.

Brooks' claim that they were trying to distance themselves from science fiction doesn't ring true until the 21st century. Before then they did run a lot of horror, but I still caught Star Trek, Space: Above and Beyond and Babylon 5 through the turn of the century. Prime time on Friday nights (oft called SciFridays) was their most advertised lineup and carried much of their original programming, like Farscape, Stargate SG-1 and Atlantis, Battlestar Galactica, Andromeda and The Secret Adventures of Jules Verne, plus Doctor Who from the BBC. Sounds pretty science fictiony to me. Even some of their largest ventures were the two massive Dune miniseries. Still, direct-to-video-quality horror movies, paranormal programs like John Edward's and Ghost Hunters, and reality shows were creeping in around 2000. And meanwhile, science fiction was disappearing from American networks.

Wednesday, January 7, 2009

Gaming Beyond DirectX?

I know a little about video game engines and such, but I thought I'd submit this to my readers to see if maybe you could help me fill the gaps in my knowledge. As some of you may know, I'm not happy with DRM-laden, resource-heavy Windows Vista, and I'm wondering what might happen in the PC gaming world when Microsoft stops selling XP. Mainly, the question is: can OpenGL completely replace DirectX 10+? Mac OS X, Unix and GNU/Linux all use OpenGL, but looking into DirectX, it would seem it's more powerful than OpenGL when matched with the right GPU (Graphics Processing Unit). DirectX 10 is supposed to be fast and streamlined, which is cool, but it's only on Vista. The thing is, both OpenGL and DirectX drive the same shader model hardware. Does that mean OpenGL has the same abilities? I speak mainly of the amazing texturing and displacement mapping now possible with engines like Unreal Engine 3. From what I have read, that engine was built for DirectX 9 (and patched to 10), so I don't know how viable OpenGL is on "next generation" platforms. Then again, Call of Duty 4 runs on Intel Macs and doesn't seem to take a hit graphically (although I've never played it on a Mac, so I'm not an authority). So, anyone have the skinny on this stuff?
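For what it's worth, OpenGL does expose the same programmable shader pipeline as Direct3D, just through GLSL instead of HLSL. Here's a minimal sketch of my own (nothing from Unreal Engine 3 or any shipping game) showing how a trivial GLSL fragment shader gets compiled through the OpenGL API. It assumes a GL 2.0+ context is already current and that an extension loader like GLEW has been initialized; the shader itself is just a placeholder texture lookup.

```c
/* Minimal sketch: compiling a GLSL fragment shader under OpenGL.
 * Assumes a GL 2.0+ context is current and glewInit() has already run. */
#include <stdio.h>
#include <GL/glew.h>

/* A trivial placeholder shader: sample a texture and write the color out. */
static const GLchar *frag_src =
    "#version 120\n"
    "uniform sampler2D tex;\n"
    "varying vec2 uv;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(tex, uv);\n"
    "}\n";

GLuint compile_fragment_shader(void)
{
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &frag_src, NULL); /* hand GLSL source to the driver */
    glCompileShader(shader);                    /* driver compiles it for the GPU */

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof log, NULL, log);
        fprintf(stderr, "shader compile failed: %s\n", log);
    }
    return shader;
}
```

So the capability is there on the OpenGL side; whether a given engine actually ships an OpenGL renderer is a separate question from whether the API can express the same effects.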