There's been a lot of coverage of Apple's "walled garden", or "curated computing", approach to the iPhone App Store. Most of it is negative, focusing on high-profile rejections that, to some, amount to Apple denying its developers and users a fundamental freedom.
The three main issues are:
- Hobbyist programmers want to be able to write their own software and put it on their phone without Apple's blessing.
- Professional developers want to be able to sell their products to people without Apple's blessing.
- Professional developers want to be able to create their products with any platform or toolchain, including Flash and other alternative frameworks.
Two solutions have been discussed ad infinitum: jailbreaking and web-app authoring, each of which allows you to run basically anything on your iPhone.
There are a few responses to the above which I haven't seen much discussion of:
- For hobbyists, the barrier to entry to becoming a "blessed" developer is just $99 and some paperwork. If you don't want to jailbreak your iPhone, then the cost of 10 albums on iTunes will permit you to do whatever you want on your own phone. And if you have a really great program, you can then publish it to the App Store and make your money back and more.
- Professional developers who want to sell their programs without the App Store and without jailbreaking can do that today. They just need to sell them as source code/resource bundles that users can compile themselves; any user with a Mac and the $99 developer membership can put the programs on their own phone any time. You're afraid that people will redistribute your program and you'll lose sales? Well, then it sounds like you want the App Store's protection, and you're free to go that route. But to say that there aren't options is disingenuous.
- Pros who want to build apps in a different framework should pressure the framework developers to build tools which output Objective-C code, so that developers can then build it with Xcode and have an App Store-legal product on their hands (a rough sketch of what such generated code might look like follows below). Of course, that's more work, but if it can be done, it would be an invaluable tool for the end-product developers, who could then get in and make modifications to enable features which are available on the iPhone but not supported in the framework.
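To make that suggestion a little more concrete, here's a minimal, purely hypothetical sketch of the kind of Objective-C such a tool might emit. The class name, the framework name and the label text are invented for illustration; a real generator would produce far more plumbing than this.

```objc
// Hypothetical generator output: plain Objective-C/UIKit that builds in
// Xcode like any hand-written iPhone project of the era (manual
// retain/release, no ARC).
#import <UIKit/UIKit.h>

@interface GENMainViewController : UIViewController
@end

@implementation GENMainViewController
- (void)viewDidLoad {
    [super viewDidLoad];
    // A trivial stand-in for whatever UI the original framework described.
    UILabel *label = [[UILabel alloc] initWithFrame:self.view.bounds];
    label.text = @"Generated by HypotheticalFramework 1.0";
    label.textAlignment = UITextAlignmentCenter; // 2010-era constant
    [self.view addSubview:label];
    [label release];
}
@end
```

The point is simply that, whatever the original source language, the artifact that ends up in front of Apple would be ordinary Objective-C compiled by Xcode.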
I'm still working on my first iPhone app and haven't submitted anything to the store yet, so take my words with whatever volume of salt you feel is appropriate. But there are options for all involved parties. You're free to like or dislike Apple's policies. And you're free to vote with your dollars and publish your feelings in public fora.
But to make claims that Apple is somehow infringing on inalienable freedoms is disingenuous to say the least, and maybe even hyperbolic. You want freedom? Make some hard choices, do some extra work.
Remember, developers, Apple's main customers are the consumers who buy their products, not the developers who make a living off of them. They've opened up a huge market with the iPhone, iPod Touch and the iPad; if you want to make a living off that market, either follow the rules or be clever and original. Don't become the technical equivalent of a Tea Party "activist".
Thursday, June 3, 2010
Saturday, February 13, 2010
The Clash of the Social Titans
Love it or hate it, or anything in between, one can't ignore the fact that Google Buzz has generated a pretty sizable amount of buzz.
People have attacked it from all sides: some feel that the way Google introduced it, built on top of Gmail, was an invasion of their privacy; others say it's a poor imitation of Twitter; others still say that there's nothing innovative about it, as it looks a lot like FriendFeed, a service that has since become part of Facebook.
But what stands out to me is what people are not comparing it to. They are not comparing it to Second Life. They are not comparing it to MySpace. Or Friendster, or any number of other services that were supposed to be the next great revolution in social networking when they came out.
Once upon a time, Second Life was an industry darling, with regular stories in the Times of people making thousands of dollars selling virtual dresses, performers like Jewel doing live virtual concerts, and enterprises like IBM setting up virtual customer support and intra-company conferencing in Linden Lab's fabricated world. It may not be a ghost town today, but at least the media coverage has moved on. IBM apparently still has their resource center there (http://www.ibm.com/3dworlds/businesscenter/us/en/), so there may still be some value to having property in Second Life, but it does not appear to be a growing concern.
Similarly, there was a time when, if you wanted to reach the cool kids, you had to use MySpace. New movies, rock bands and more didn't have their own web pages, instead hosting all of their information on MySpace. They took social networking to new heights, and grew so fast that media conglomerate News Corp bought them up, riddled the experience with advertisements, and then watched as their user base grew up and left, or just left for the new darling, Facebook.
It's evolution, to some degree, but at the same time, it's just history repeating itself over and over and over. The issue is really that these social networks are built around growth, but they don't have an end goal. And unfortunately, the features and behaviors that help grow a great social network do not necessarily make for a sustainable and useful one. Eventually, with constant growth, the network becomes clogged and noisy, invasive and indistinguishable from the outside world. In their quest to become the ultimate way to stay in contact with your family, friends and colleagues, social networks tend to ignore their own initial value proposition, which is--at least implicitly--that using the network is a way to separate the wheat from the chaff. And in their rush to include every single grain of wheat, they inevitably start letting in a lot of chaff.
I wish I had a constructive solution, but at the moment all I see is the problem. Each network has its own strengths and weaknesses, and I anticipate that we'll see a few more before one surfaces with the right tools to allow people the access they desire with the controls they need. In the meantime, you can find me on Facebook, slowly culling friends from my list.
Saturday, February 6, 2010
iPad 3 - Niche Platform
It should come as no surprise that I think that the iPad is going to be a reasonable success. I am not going to be first in line to buy one, and probably won't even spring for the first revision or two. But I'm sure that I will own one, and I believe strongly that there will be a pretty good rush on the devices from a surprisingly wide range of people.
There are lots of great justifications as to why the product will either fail miserably or succeed beyond anyone's imagination. And there are as many declarations that it's a true revolution as there are that it's just a big iPhone or iPod Touch. But strangely enough, most of those rationales do not appeal to me; the one that jumps out is that last one, which is usually used to dismiss the iPad as a product.
I don't have an iPhone, but I do have an iPod Touch. Even so, I don't use it to listen to music very often. It's not safe to do while driving or biking, it would get in the way of my work during the day, and it wouldn't be very good for my marriage if I listened to it while I was spending time with the family. My primary exercise is swimming, and while I considered looking into a waterproof set-up, the idea of strapping that thing to my trunks seems a little high maintenance. I sometimes listen to it while I'm walking my dog, but these days time is so limited that I typically use that time to catch up with friends by phone instead.
So instead, the iPod mostly sits on my desk during the day, on the coffee table evenings and weekends, and at my bedside at night. It's my alarm clock, for one thing, but it's also just a really convenient way to check my personal email, Twitter, calendar, train schedule, etc. during the day, and to do casual browsing and the like when I'm not at work. Pulling out my laptop for anything shy of a blog entry or actual coding seems like a waste of effort. If I'm going to use the laptop, it's a commitment. I can check my email on my iPod in about 10 seconds from the moment I pick the device up to the time I put it down.
Right from the start, I thought to myself that if only the iPod Touch were a few inches bigger, it'd make a great ebook reader, and it would be a much better browsing experience. I thought also that if I had such a device, I might not use my laptop except for work.
And now such a device exists, or will exist in a couple of months. I had been hoping for something a little smaller, more like 7" diagonal, with a smaller bezel, and something that would jack into a dock in order to drive an external monitor and act like a normal Mac, reverting to a tablet when it was out of the dock. But while it's not my dream device, it is indeed something that I will use, because effectively, it is a big iPod Touch. If only my Touch didn't work so wonderfully, I'd consider buying an iPad right away, but unfortunately, it remains useful enough that I can't justify plunking down $629 in late March. Yes, I know it comes cheaper, but 3G would mean no worrying about hot spots...
And that brings me to the biggest effect that I think the iPad will have on the rest of the industry. Just like the iMac pushed USB into ubiquity, the iPad is going to be the thing that makes everyone expect 3G/WiMax radios on all mobile devices. Phones, laptops, netbooks and yes, tablets. Companies that make dongles should start thinking about their next business today.
Tuesday, February 2, 2010
iPad 2 - Flash
With the release of the original iPhone, then the iPod Touch, each subsequent hardware and software revision of each, and now the iPad, there has come a growing chorus of complainants who bemoan the lack of Flash support on Apple's multi-touch platform. And as this group's cries grow louder, the defenders of Apple's platform decisions dig in deeper and deeper, entrenching themselves for the battle to end all battles. If the blogosphere is any indication, this is a huge and polarizing issue, and if modern-day politics are any indication, we all have to take sides, and whatever side one takes, one cannot give an inch.
As such, I am now going public with my stance: the iP{hone,od,ad} never has, does not and never will need Flash support. Further, anyone who feels that Apple will fail without it is clearly a Communist Nazi terrorist who doesn't return library books on time.
Joking aside, I am casting my lot in with Gruber et al, and suggesting that the absence of Flash is really not a huge deal for any of these platforms.
Flash has a great legacy: at the very least, we can thank it for getting the great majority of Java applets that were all the rage in the mid-to-late '90s off of our web pages. While those applets still have a place, they mostly felt tacked on, stuck into the web pages in which they were embedded; they took too much processing power, and they were often buggy. So long, dancing Star Trek Federation insignia, we hardly knew ye.
And for all the great points regarding Flash support made on both sides of the argument, the key for me is that the bar has been raised with regard to seamless web experiences, and many Flash apps are now what embedded Java applets were just a few years ago. They feel tacked on. They interrupt the flow of the page. Unless the app itself is the reason you're going to a page, finding Flash running--e.g., in an advertisement or a movie that auto-starts without the viewer requesting it--is an annoyance, and it has an adverse effect on the overall experience.
And just like Java applets went the way of the dinosaurs when Flash became the prevalent plug-in technology, the bell is tolling for Flash because HTML5 is on the way, and there's very little you can't do using HTML5 and JavaScript with modern CSS transforms and animations, and built-in support for video. Sites all over the web are demonstrating the possibilities, with YouTube leading the way with an HTML5 version of the site that doesn't make my laptop's fan kick in.
There are lots of solutions to the problem. Maybe Adobe should open-source the technology and hope that the major rendering engines--Gecko (Firefox), WebKit (Safari and Chrome) and Trident (IE8+)--adopt it, thus allowing Adobe to leverage its position as a premier provider of content creation tools. I'm not sure if this is the best idea in the world, but it's a darn sight better than digging in and trying to hold onto a market position that is clearly slipping away.
Saturday, January 30, 2010
iPad 1 - Multitasking
Don't get me wrong: I'm a power user. I like gobs of RAM, lots of HD space, multi-core processors and fast networking. I like tuning my machine to run full tilt and then bringing it to its knees with the applications I run on a regular basis. Mixing down music I've recorded, ripping a DVD so I don't have to carry the disc on the plane, various and sundry development environments, virtual machines and terminals and browsers galore. I love forcing my machine to do my bidding.
But all this talk about the iPad failing because of missing multitasking is, well, just silly. The best example people can come up with is that you can't browse the web while listening to Pandora. But of course, you can browse the web while listening to music, since the iPad, effectively a big, 3G-enabled iPod Touch, puts music playback at the system level. Pandora may be great and everything, but I'd hardly call that a deal killer.
The iPad does indeed do multitasking. System-level activity runs right alongside the apps that users install from the App Store. It just doesn't do application-level multitasking, and while that's something that I like to be able to do on my desktop, I'm not sure that it's such a big problem on a tablet-type device.
One of the great things about iPhone apps is that they are all required to shut down within five seconds of your hitting the home button. Many--maybe most?--take that five seconds to save state so that when you come back in, you can continue what you were doing. And like it or not, the majority of the time, that's what most users are doing when they have multiple apps running on a desktop/laptop computer at the same time. They aren't necessarily interested in the background application doing significant processing while they interact with a foreground application. Instead, they leave multiple programs running so that they can move back and forth as they need to without re-establishing state. When you move from Outlook to Excel to Word, copying and pasting data, the programs you push to the background aren't doing anything special except taking up RAM and processor time while they wait for you to call on them again.
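To make that concrete, here is a minimal sketch, in 2010-era Objective-C, of how an app might stash and restore a bit of state around that forced shutdown, using the standard UIApplicationDelegate callbacks and NSUserDefaults. The delegate class and the "lastOpenDocumentIndex" key are invented for illustration.

```objc
#import <UIKit/UIKit.h>

// Sketch of an app delegate that saves a little state when the user hits
// the home button and restores it on the next launch.
@interface SketchAppDelegate : NSObject <UIApplicationDelegate>
@property (nonatomic, assign) NSInteger lastOpenDocumentIndex; // invented example state
@end

@implementation SketchAppDelegate
@synthesize lastOpenDocumentIndex;

- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Restore whatever we stashed the last time the app was closed.
    self.lastOpenDocumentIndex = [[NSUserDefaults standardUserDefaults]
        integerForKey:@"lastOpenDocumentIndex"];
    return YES;
}

- (void)applicationWillTerminate:(UIApplication *)application {
    // Called when the app is being shut down; there are only a few seconds
    // to get state onto disk before the process exits.
    [[NSUserDefaults standardUserDefaults] setInteger:self.lastOpenDocumentIndex
                                               forKey:@"lastOpenDocumentIndex"];
    [[NSUserDefaults standardUserDefaults] synchronize];
}
@end
```

With something like that in place, jumping out of an app to check something and then coming back feels a lot like switching between running programs, even though the app was fully relaunched in between.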
There are, of course, exceptions to this rule. If you're doing complex graphics rendering, or video encoding, or compiling code, you definitely want multitasking to allow you to continue to use your computer while work is being done in the background. For most cases like this, I'd simply say that the iPad is the wrong tool for the job. But for email, web browsing, some types of content creation and media consumption, it just may not be so important. Being able to move between apps which have saved their states upon exit may indeed be good enough.
I would be interested in seeing an OS feature which would allow you to easily switch, with a gesture, between the last three (or so) applications you've run. If you're moving data from one program to another (e.g., between the iWork apps), it might be nice to be able to jump back and forth without going "home", then into another application, then back again. Dollars to donuts, if they had that feature, most people wouldn't know the difference between the fast switching and actual multitasking.
Again, don't get me wrong: I'm not about to try to install the iPhone/iPad OS on my desktops. But for these simple mobile devices, it's not a huge loss in my book. In fact, when balanced against the additional resources that would be required, and the effect multitasking could have on performance and battery life, I'm thinking that this is the right decision.
Friday, August 14, 2009
On Racism.
Posted as a comment to this article on white privilege.
I enjoyed your paper, and what struck me is that with a gentle twist, it could have been about a lot of other topics. The situations you discussed certainly were prime examples of ignorance and fear, which are two important ingredients of racism, but it seems to me that the authors in question could have been in any other group and come up against similar challenges.
I guess what I'm saying is that even if the person getting victimized by ignorance and fear belongs to a racial/ethnic minority, I'm not sure that it's actually racism unless their race or ethnicity is the deciding factor.
What if the picture on the cover of Liar had instead been of a man? Would the publisher's excuse be any more acceptable? Having not read the book, I can only imagine that the character could also be lying about her gender. Would it have been worse if it had been a white man? Black? Latino?
With regards to DeShawn Days, the removal of "crack vials" sounds like a white-washing of a minority's childhood experience, but would a children's book detailing a parent's alcoholism or violent abuse of a spouse be published without any of the less-friendly imagery being edited out? I assume the same reasons--teachers and children's librarians wouldn't buy it--would be cited.
And Ahmed's struggle with the title is poignant, but it seems to me that the industry is rife with stories like this: mistakes made by the people who run the presses which, for fear of the cost of correction, they then refuse to fix. I remember a scene in Taxi Driver where Albert Brooks is at a campaign office complaining to a vendor that the buttons they sent said, "WE are the people" instead of "We ARE the people". I'm not saying that it does not matter, but neither the mistake nor the follow-up was necessarily racism, even though they can clearly be deconstructed as such.
I don't think we've moved beyond race here in America or anywhere else, any more than we've moved beyond sexism, classism or any number of isms. But I do think that in some cases, the root causes of behaviors that seem racist (or *-ist) are of more importance than the perceived -isms.
The publishers are afraid that people won't like it if kids see a book of dark poetry including drug imagery. Maybe there's room for a publisher who is not afraid. If that latter publisher makes a financial killing, then the more staid publisher will follow.
I assume that authors have contracts with regards to publication before those proofs come out. From what I hear, JK Rowling has had complete creative control since day one of the Harry Potter franchise. If an author doesn't like what a publisher is doing, they should be able to take their work and walk.
I don't have a solution for everything, but I think it may not be constructive to assume that, when there's a conflict between two people of different races--even when cultural differences exacerbate the conflict--that the core problem is racism and racism alone.
Wednesday, August 5, 2009
Sam I Ain't
Y'know, I was reading Green Eggs and Ham to my son (see icon) the other day, and I realized that the most important message wasn't one of being foodventurous (cool term, BTW). It's a message of identity and attachment.
We all know the name of the guy pushing the GE&H: Sam I Am, right? And everyone else in the story is identified as *something*, at the very least--box, fox, house, mouse, train--except for our protagonist. He's got no name. His species is indeterminate. We don't know anything about who or what he is, only that he won't eat GE&H, and the fact that he's annoyed by SIA.
And of course, we learn through the story that he is deeply, strongly identified with the fact that he hates GE&H, even though he's never tried it. It's an attachment that keeps him from living his life, because he spends all but two of the pages of the book--the beginning, where he expresses his disdain for SIA, and the end, where he expresses his love for GE&H--defending that (non-) identity.
And at the end, after wasting time and energy, after causing a train rain boat moat disaster, he ends up not only trying GE&H--note the defeated expression on his face when he does so--but actually LOVING it, declaring all that had been true to suddenly be untrue. It was a total personality shift.
Except for one thing. He doesn't say that he loves SIA, and based on the fact that he hated SIA right from the start, this isn't the first time that they've had this interaction, and it won't be the last. And based on the fact that SIA never loses his faith in his ability to convert no-name, my guess is that other interactions have ended up the same way, with no-name clinging to some shred of identity which still, pages later, slips away.
The lesson, I told my son, is not to be willing to try anything, but to know himself and to be wary of the motivations of others who are trying to influence you. Sam I Am could have been a pusher, saying that no-name needed to try Giggly Extacy and Heroin, or God, Ecclesiastes and Heffalumps, and no-name let himself be worn down. Of course he was happy with the new experience, which turned out to be not as bad as he had made it out to be (how could it have been?), but having lost that last shred of identity, his soul was that much more empty and easy for SIA to claim the next time around.