Heini van Bergen
Passionate about Internet. Loves Online Marketing, SEO, SEM, Affiliate marketing, Email marketing, Apple, Lonneke & Tim
Contact me to find out how I can turn my passion into benefit for your organization.
- Online Marketing
- Google AdWords
- Google Analytics
- Affiliate Marketing
- Email Marketing
- Project management
- People management
- Master of Business Administration (MBA)
- Masterclasses PHP Web developer (Architecture, CVS, OOP)
- DNS administration
- HEAO-CE (Sports management) BC degree
I just don't live close enough to the border to vote via a German network. #esf
The best is not always saved for last #esf
After Bonnie Tyler, now Axl Rose with a comeback at #esf too?
RT @rachidfinge: We won! Even if we don't get the most points tonight. #esf
And now with a photo http://t.co/wl5lvhxxz1
On the way to the @jcfoundation cup in the VIP bus
RT @TNWinsider: Everything announced at the Google I/O 2013 keynote in one handy list http://t.co/lPEyBZPG8D by @margotlily
RT @hellemans: LOL! "@inafried: Next version of Google photo editing: If your baby isn't smiling in the photo, a Googler will step in and t…
RT @caricevhouten: Haha “@sebastianlenton: Daft Punk, without their helmets: http://t.co/8M15kMKASi”
"I Screwed Up: How 3 Famed Entrepreneurs Learned From Failure | http://t.co/QeBRSaAOPD" http://t.co/KCQyisb0Ki
SEOmoz: "How to Rank: 25 Step SEO Master Blueprint" http://t.co/9sDBAB2eQH
@m4tthijs don't tweet and drive
Gosh, Jan Smit calling a song simple.. #ESF
Nice giant #ESF
That #ESF remains a strange puppet show
RT @Sywert: So 65% of households receive healthcare allowance. And 70% of Dutch people receive rent, healthcare or childcare allowance. Ro…
RT @mattcutts: Pretty much every SEO should watch this video: http://t.co/plxx9WPqDl (unless you prefer surprises)
Taking part in the @jcfoundation cup at the Olympic Stadium on Friday. Nice to see that @legalexperience has also entered a team
Shared by Heini
What is a Reverse Proxy and how can it help my SEO? http://smf.is/1y7z2u (from Rand Fishkin, and Martijn Verstrepen)
Posted by Slingshot SEO
Subdomains have often been the bane of many SEO-conscious organizations, but an easy solution might be right under your nose.
By using subfolders in place of subdomains, you can unite your content under one domain. While this may seem difficult to do when two sites exist…
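The subfolder consolidation described above is typically done by putting a reverse proxy in front of the legacy subdomain. As a rough sketch of just the URL-mapping step (the hostnames and folder names here are hypothetical, not from the post), the rewrite logic looks like:

```python
from urllib.parse import urlparse

# Hypothetical mapping of legacy subdomains onto subfolders of the main domain.
SUBDOMAIN_TO_FOLDER = {
    "blog.example.com": "/blog",
    "shop.example.com": "/shop",
}

def to_subfolder_url(url: str) -> str:
    """Rewrite a subdomain URL to its subfolder equivalent on the main domain."""
    parts = urlparse(url)
    folder = SUBDOMAIN_TO_FOLDER.get(parts.netloc)
    if folder is None:
        return url  # not a migrated subdomain; leave it untouched
    return f"{parts.scheme}://www.example.com{folder}{parts.path}"

print(to_subfolder_url("http://blog.example.com/post-1"))
# -> http://www.example.com/blog/post-1
```

In production the same mapping would live in the proxy configuration (and the old URLs would 301-redirect to the new ones), but the idea is the same: one domain, many sections.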
One of our favorite web browsers just got a cool new tool in the form of Stylebot, a new Chrome extension that allows you to access and modify the CSS for any web page from within the browser.
That’s right — users get a completely customized design experience for any page they choose. The changes they make can be saved for later use and synced across multiple devices.
This is great news for you design enthusiasts as well as for end users with specific needs and wants for their browsing experience. For example, the extension makes web pages with small fonts more accessible by allowing users to increase the font size, and it can make browsing the web less commercial by removing ads.
Stylebot generates a sidebar full of basic and advanced CSS options that allow the end user to manipulate how content is displayed. This tool is simple enough to be used by a moderately competent consumer, but it also has options better suited for those with web design skills. Stylebot can be used to change font attributes, remove advertising, move page elements, change colors, write one’s own CSS selectors and quite a bit more.
Googler Rachel Shearer wrote the following today on the company’s blog:
“For example, a Stylebot user with special reading needs might change a webpage by removing images, picking new text and background colors, and even moving blocks of text around. And Stylebot saves the custom style they create, so the next time they access that page the changes will still be there. Even better, they can sync their saved styles across computers so that webpage will always appear with their preferred style.”
Check out this brief demo video to see Stylebot in action:
Stylebot was created as a Google Summer of Code project by Ankit Ahuja, a computer science student in New Delhi, India. Stylebot is open source and forkable; interested parties can check out Ahuja’s source on GitHub. He said he used elements of other open-source projects, such as Aristo and Firebug, in his work.
What do you think of Stylebot so far? Would you use it to prettify the ugliness that is Craigslist, for example, or to simplify content viewing on a news site?
Posted by Dana Lookadoo
This post was originally in YOUmoz, and was promoted to the main blog because it provides great value and interest to our community. The author's views are entirely his or her own and may not reflect the views of SEOmoz, Inc.
How do I recap the SEOmoz PRO Seminar session on Uncovering a Hidden Technique for SEO? The title is so attractive that it produces Pavlovian symptoms as we salivate at the thought of uncovering a hidden SEO treasure. Ben Hendrickson of SEOmoz presented a model which appears to show how Google may be assigning relevance to keyword terms based on context - topical relevance.
Is Latent Dirichlet Allocation (LDA) that hidden jackpot?
1st - LDA is not new, nor something SEOmoz invented. The Information Retrieval model has been around for 7 or 8 years, and IR geeks have talked about it before. There are a number of resources, as well as naysaying, about LDA and Google's possible use of it.
2nd - What is new is SEOmoz's LDA Topics Tool, which produces a relevancy score based on a query (search term). It enables one to play with words that may increase a page's relevancy in the eyes of Google. It shows words that help Google determine how relevant the page is to a user's search query.
Kyle Stone tweeted that the LDA tool is a game changer, and many retweeted.
Is SEOmoz's LDA tool a game changer? That's yet to be seen. The goal here is to report Ben's research as presented at the Mozinar and how a layman (myself) interprets it. Rand is going to do a follow-up post to explain more.
Why all the hype?
The SEO Challenge
SEOs face the continual challenge of figuring out Google's hidden ranking algorithms. How do we rank higher? Which signals are the most important? We know search engines are "learning models" that attempt to understand the "context" of words. Google has said for years that webmasters should concentrate most on providing good, relevant (contextual) content.
There are ways to rank higher. Is it as easy as 1, 2, 3?
- Create quality copy with keyword(s) on the page along with associated anchor text links.
- Get good links.
- What Ben talked about in this session.
LDA - Topic Modeling & Analysis
Latent Dirichlet Allocation, in layman's terms, translates to "topic modeling." In search geek terms, LDA is the following formula:
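The formula slide itself isn't reproduced here; what Ben showed is presumably the standard LDA generative model from Blei et al. (2003), which factors the joint probability of a document's topic mixture, topic assignments, and words as:

```latex
p(\theta, \mathbf{z}, \mathbf{w} \mid \alpha, \beta)
  = p(\theta \mid \alpha) \prod_{n=1}^{N} p(z_n \mid \theta)\, p(w_n \mid z_n, \beta)
```

Here \(\theta\) is the document's distribution over topics, \(z_n\) the topic assigned to the \(n\)-th word, and \(w_n\) the word itself.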
(Did you digest that? Don't worry; Mozzers groaned and laughed at the same time. PLUS: Scientist Hendrickson delivered this session after lunch!)
LDA Simplified - Here is Ben's way of explaining topic modeling:
(Okay, I was once proud that I got an A in Logic and Combinatorics - discrete math/set theory. However, that computer science class now feels like basic math compared to this formula.)
It made more sense when Rand Fishkin joined Ben on stage and when Todd Freisen moderated and deciphered during Q&A. (Manuela Sanches of Brazil was sitting next to me and said that Ben's "presentation needed subtitles!")
The objective of LDA, from my deciphering of Greek, is to understand how Google is using semantic contextual analysis combined with other signals, to define topics/concepts. It's how Google analyzes the words on a page to determine the "set" to which a word belongs - how relevant a search query is to pages in its database.
For example: How does Google assign relevance to the word "orange" on a page? They determine orange is related to the fruit set or to the color set by page context.
"Latent Dirichlet Allocation (Blei et al., 2003) is a powerful learning algorithm for automatically and jointly clustering words into "topics" and documents into mixtures of topics. It has been successfully applied to model change in scientific fields over time (Griffiths and Steyvers, 2004; Hall et al., 2008).
A topic model is, roughly, a hierarchical Bayesian model that associates with each document a probability distribution over "topics", which are in turn distributions over words."
Bayesian - ah, a term I recognize!! Bayesian spam filtering is a method used to detect spam. It draws off a database and learns the meaning of words. It's "trained" by us when we mark an email as spam. It looks at incoming emails and calculates the probability that the content of an email is contextually spammy.
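The spam-filter analogy can be made concrete with a toy naive Bayes sketch (the training messages are invented for illustration): count how often each word appears in known spam and known ham, then combine those word probabilities for a new message.

```python
from collections import Counter

# Toy training data (invented for illustration).
spam = ["win free money now", "free offer click now"]
ham = ["meeting agenda for monday", "lunch plans for friday"]

def train(docs):
    """Return (word counts, total word count) for a list of documents."""
    counts = Counter(w for d in docs for w in d.split())
    return counts, sum(counts.values())

def spam_probability(message: str) -> float:
    """Naive Bayes with add-one smoothing: P(spam | words), equal priors."""
    s_counts, s_total = train(spam)
    h_counts, h_total = train(ham)
    vocab = len(set(s_counts) | set(h_counts))
    p_spam = p_ham = 0.5
    for w in message.split():
        p_spam *= (s_counts[w] + 1) / (s_total + vocab)
        p_ham *= (h_counts[w] + 1) / (h_total + vocab)
    return p_spam / (p_spam + p_ham)

print(spam_probability("free money"))      # > 0.5, leans spam
print(spam_probability("monday meeting"))  # < 0.5, leans ham
```

Real filters train on millions of messages and many more signals, but the "learned from context" idea is the same one LDA scales up to topics.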
I found a PowerPoint presentation about Bayesian Inference Techniques by Microsoft Research from 2004 that presents the possibility of using LDA. Go to slide 54 and read:
"Can we build a general-purpose inference engine which automates these procedures?"
Microsoft has been looking at LDA models. Do search engines use it as one of their primary methods?
Ben sampled over 8 million documents with approx. 1,000 queries. He believes Google is using LDA topic modeling to determine (learn) what words mean by their associations with, and relevance to, other words on the page. (Other factors are included.) Ben called the results a "co-occurrence explanation" that uses a "cosine similarity."
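The "cosine similarity" Ben mentions can be illustrated with a toy sketch (the word lists are invented, and real systems compare topic distributions rather than raw words): represent a page and a query as bag-of-words count vectors and measure the angle between them.

```python
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine of the angle between two bag-of-words count vectors."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# A page about citrus "orange" scores closer to a fruit query than a color query.
page = "orange juice fruit peel vitamin"
print(cosine_similarity(page, "orange fruit"))        # higher
print(cosine_similarity(page, "orange color paint"))  # lower
```

This is the "orange: fruit set or color set?" question from earlier, answered by the company the word keeps on the page.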
- Results that are higher in Google SERPs, in general, have more topical content.
- Search engines do APPEAR to apply semantic analysis… when indexing a page and determining the intent of the words on the page.
Rand tweeted an explanation (in 140 x 4) as follows:
Dana's LDA Catwalk Metaphor for Topic Modeling:
Imagine the words on your page as walking down the fashion runway in Paris. Your keyword phrase is "dressed" in semantic accessories, words that correlate to and dress up your topic. Associated words bring meaning to and highlight the fashion model's outfit. Adjectives, modifiers and synonyms are like jewelry, hats, and shoes. The combination can transform your base layers (your target terms) from casual or conservative business attire into a sexy night-on-the-town ensemble.
Combinations and permutations of words on a page "dress" your skinny or curvy fashion model. Relevant words provide Google with an image of what she is wearing and the catwalk upon which she struts. LDA refers back to what Google already knows about these "accessories" (words) and their previous association with the topic terms related to fashion.
Enter Topical Ambiguity - I just broke the "rules" for context with the catwalk metaphor by referring to modeling in two contexts on this page:
- I used "modeling" terms that relate to the "fashion industry" set.
- The catwalk metaphor is irrelevant content that is off-topic for discussing "LDA topic modeling."
Google Algorithm Exposed?
Ben clearly said that LDA is an ATTEMPT to explain the SERPs. His scenario, a quote from his presentation slides, follows:
One of us needs to implement it so we can:
1) See how it applies to pages
2) See if it helps explain SERPs
LDA is not LSI.
There were some tweets claiming SEOmoz was bringing back LSI or snake oil. Ben clarified that LDA is not LSI, which deals more with keyword density. He explained that he is NOT talking about loading keywords on a page, but about the relevance of the topics within the page. He said that:
"LSI doesn’t have the same bias toward simple explanations. LSI breaks down as you try to scale up the number of topics."
The LDA tool deals with context, semantic relevancy, not density - in addition to some other random factors. Example:
If SEOmoz has a page all about "SEO" and "tools," and there is another word on the page that can be explained by a word more related to the SEO topic, then the related word would be used. Meaning, "seo tools" doesn't have to be repeated over and over; the related word would be interpreted by Google as being relevant.
Ben, who appears to have the brain of a search engine, noted that it "appears" LDA is what Google is heading for in the near future. He said (paraphrased):
If they are not doing it, they seem to be doing something that has the same output. They are probably already using it.
It’s a super weird coincidence if Google is not using it.
Are On-Page Signals Stronger than Links?
Are we heading toward more emphasis on on-page topic modeling? I'm not an IR geek, but I do plan to spend more energy on understanding how search engines retrieve information. We are dealing with a semantic Web. LDA may indicate that good old on-page optimization sends stronger signals than links.
SEOmoz's LDA tool attempts to show how relevant content is to a chosen keyword. It computes relevance of queries.
The score at the top is an indicator of how relevant the content on that page is according to LDA.
- Aaron's content is 72%* relevant for the query "seo tools."
- SEOmoz's tools page is 40%* relevant.
*NOTE: (I inserted the logos.) You can run the same pages and get different results. The results are similar in that SEO Book always scored as more topically relevant, but the percentage varies. Is this the random Monte Carlo algorithm at work? Ben?
"How do we execute this for SEO?"
"I don't actually do SEO. I write code."
That's up to us, the SEOs, to play and test in our Google playground.
Use the tool to decide if you can win with LDA to optimize your on-page signals.
- Use the LDA Topics Tool to return words that could be used on a page for a query.
- Then determine who is ranking for that term.
- Simply write content that is highly on-topic based on the findings you observe.
If you are not performing that well in the SERPs, think about classic on-page optimization. In the example above, rather than putting another instance of "seo tools" on the page, LDA shows there are better ways to tell Google that you are about that topic. The tool provides a way to measure that.
IMPORTANT: There is a threshold at which too many related words will appear as too spammy. LDA is not something to be used to game Google.
Test the LDA Tool out for yourself, and draw your own conclusions.
DISCLAIMER: I'm not claiming this methodology has uncovered hidden SEO treasures. Time, testing and playing around with a new SEOmoz tool while observing the SERPs will reveal the answer. In the meantime, I'm going to dress up my pages and accessorize them with relevant terms that make them dazzle so they look good climbing the Google catwalk.
“404 Not Found.” These three little words can make any Internet explorer an unhappy camper. After all, who hopes to click on a broken link or stumble upon a moved or deleted page while cruising around the web?
Luckily, some web designers have chosen to end the misery of encountering a 404 error page. Instead of letting their site readers bump heads with a nasty dead-end error message, they’ve managed to squeeze a little entertainment out of it.
Below you’ll find some of the most entertaining 404 error pages on the web. We’ve listed them alphabetically to avoid playing favorites, but they’re all worth a look. Share your favorite 404 error page designs in the comments below!
1. 501st Legion
501st Legion is a Star Wars costume organization. It only makes sense that its 404 page would play on Obi-Wan's famous Jedi mind trick with a "weak-minded" stormtrooper.
Yes, Houston, a 404 is definitely a problem.
A nice ol' chap comes to your assistance on ApartmentHomeLiving.com if you run across a 404 error page. Click the lovely lady peering from behind the frame for proper navigation suggestions.
"Uh-Oh! SpaghettiOs!" You know you're a part of pop culture if your jingle makes it into a 404 error page. The popular SpaghettiOs marketing jingle is here to stay.
Head over to Blippy's 404 page for an adventure. Keep clicking on the boy dressed in a unicorn outfit to discover a triple rainbow! Who knew a 404 page could be so fun?
6. Factor D
In an ode to early horror films, Factor D features an appropriately horrific 404 error accompanied by a beautiful, yet terrified scream queen.
Many 404 error pages apologize for the error. Not this one. It's obviously your fault.
8. Blue Daniel
This 404 error page is a beautiful depiction of "Track 404," a fictitious NYC subway line. Check it out to experience the full animation.
"You can click anywhere else, but you can't click here." Love it.
Bottom line: You can't go wrong with cute kittens.
11. Chris Jennings
Most of us would like to run into a 404 error page just as much as we'd like to run into the Grim Reaper.
When you're facing "sharks with laser beams attached to their frickin' heads," what do you expect? Definitely a 404 error.
Well, that can't be good.
Picking a 404 error page design that is consistent with your name is an appropriate move. CSSscoop chose a melting ice cream cone, with a scoop of ice cream, of course.
Sarcasm in dire situations is always appreciated, right?
Try not to make any missteps on GOG, or you'll end up lost in the cosmos.
Insulting your readers doesn't usually help, but this 404 made me chuckle.
Hoppermagic chose to stick to its brand imagery when creating its 404 page.
There's just something about a really frustrated baby that catches your attention. And if you've made it to this 404 page, you probably feel his pain.
20. Itchy Robot
If you can't think of something clever to say on your 404 page, just write exactly what your users are thinking.
Jackfig added a creative touch to its 404 error page, with an inspirational haiku.
22. Jolie Poupée
Jolie Poupée, creator of eco-friendly children's clothes, serves up an audience-appropriate 404 on its site.
23. Mark Dijkstra
This 404 error page is reminiscent of the kitschy tourist shirts that your lousy friends and relatives buy you when they visit amazing places.
Prithee, go medieval on your site's visitors if need be.
Imagery always makes a 404 more entertaining.
Some 404 error pages do a wonderful job of explaining to users exactly what caused the 404 error. OrangeCoat provides a fun decision tree for lost web surfers that is sure to help them along their merry ways.
Have fun with colors, shapes and exclamations.
28. Sick Designer
Sick Designer captures the depression that a 404 can cause on its error page.
29. Student Market
How fitting that a student-centric site would feature an addition problem on its 404 page.
This page just pops. We like it.
31. The North Face
For true entertainment value, why not just tell a story? The North Face does just that by creating a tale about link-eating mountain goats.
You might have been pwned, burned, punk'd or rickrolled recently, but have you been 404'd? Click here to join the party.
33. TK Designs
Excitement! Adventure! Ahhhhh, where am I?!
34. Urban Outfitters
Our sentiments exactly.
35. Urban Pill
If, after searching for hours, you still can't find the page you were seeking, Chuck Norris probably has it.
More Web Design Resources from Mashable:
- 12 Beginner Tutorials for Getting Started With Photoshop
- Use Adobe Fonts in Your Own Web Designs
- 10 Essential Free E-Books for Web Designers
- 12 Beginner Tutorials for Getting Started with Adobe Illustrator
- 6 New Mac Apps for Designers and Developers
[Note: While most videos on Vimeo.com would play back on iDevices prior to this update if watched on the Vimeo site, the new 'Universal Player' embed code should allow publishers and website developers to include compatible embeds on their sites. Vimeo does not provide mobile versions of every video on the site, limiting some features to Plus (paid) users. Details on making videos mobile-friendly are in the Vimeo FAQ and the new features announcement. -Ed.]
Popular video site Vimeo (think a more artsy YouTube) has changed its embed code to be completely HTML5 compatible, which means you can now browse the site and embed videos that play back on the iPhone or the iPad. I just pulled the site up on my iPhone, and I have to say, I think it's a smoother browsing experience than the browser itself -- you just get a list of videos, and clicking on whichever one you want (like, for example, the great Dennis Liu music video above) opens it right up in QuickTime. Good deal. Vimeo's been flirting with HTML5 for the better part of this year, but this switchover means everything (including embeds when seen from an iPhone or iPad) is available in HTML5 from the start -- bye bye, Flash. [Flash will still be served to desktop browsers. -Ed.]
The new update also adds a "Watch Later" feature to accounts on the site, so you can save videos and pull them back up on the device of your choice, even if you're not using something that works well. Eventually, the Watch Later feature will be added in to the Vimeo API, and there's also a new Vimeo channel available on Roku set-top boxes if you've got one of those.
But the HTML5 change is the biggest one -- one more site leaves the Flash-only fold and becomes extremely accessible to Apple's platform.
Posted by randfish
How many presentations do you see that show traffic stats like these?
These charts aren't wrong, per se. They're not lying to you, but they are obscuring the truth, and they're making it impossible to know what's going right and wrong.
The problem isn't that the numbers are inaccurate; it's that no website is just ONE SITE. A website is a collection of pages, and oftentimes a collection of lots of different KINDS of pages. Even the simplest of sites, built on blog CMSes like WordPress or basic CMSes like Drupal, have unique sections within them - the homepage, individual posts, static pages (about, contact, et al.), categories, search pages, posts by month, author, etc. - all of these have different formats, different functions and, almost certainly, different visitor stats.
Yet, for some reason, when we as marketers look at a site, we don't ask "how are the category pages doing this month?" or "how is the blog performing compared to the white paper articles?" We ask, "how's the site doing?"
The singular answer to that question often obscures a more nuanced, but valuable truth: Different website sections perform differently.
If your car starts having trouble accelerating up hills, you don't blame the entire car for the subpar performance, you start to examine potential causes (electrical system, engine, tires, etc.) and break these components down until you find the cause. Likewise, with a website, every piece should be performance tested, tuned and monitored on a regular basis.
Don't do this:
The total page views data is fine as an overview, but we need to monitor each individual section to really understand what's gaining vs. falling.
By segmenting out traffic to URLs that include */blog/* and those that include */ugc/* (YOUmoz), we can see when/where/how each section is rising or falling in traffic and contributing to the overall site's performance.
Even better, we should do this:
How did I make that chart?
Step 1: Separate the areas of your website by the words/characters in their URL string (or other identifying factors like keywords in their titles). For example, on SEOmoz, we've got:
- The Blog - all URLs include /blog
- YOUmoz - all URLs have /UGC
- Guides - nearly all have /articles
- Tools - most URLs are different, but there are only around 20, so I can lump them together
Once I have these segments, I'll use the URL structures to get data about pageviews (or any other metric I care about) separately through analytics.
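The bucketing in Step 1 can be sketched in code (the URL fragments mirror the SEOmoz sections above, but the pageview numbers are invented for illustration):

```python
# Hypothetical weekly pageview log: (URL, pageviews) pairs.
pageviews = [
    ("/blog/whiteboard-friday", 1200),
    ("/blog/lda-topics", 800),
    ("/ugc/youmoz-post", 300),
    ("/articles/beginners-guide", 650),
    ("/tools/lda", 150),
]

# Each section is identified by a substring of its URLs, as in Step 1.
SECTIONS = {"Blog": "/blog", "YOUmoz": "/ugc", "Guides": "/articles", "Tools": "/tools"}

def segment(rows):
    """Sum pageviews per site section, based on URL substring matches."""
    totals = {name: 0 for name in SECTIONS}
    for url, views in rows:
        for name, fragment in SECTIONS.items():
            if fragment in url:
                totals[name] += views
                break  # each URL counts toward one section only
    return totals

print(segment(pageviews))
# -> {'Blog': 2000, 'YOUmoz': 300, 'Guides': 650, 'Tools': 150}
```

Running this per week and stacking the weekly totals gives exactly the kind of segmented chart described in the following steps.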
Step 2: Use the content filter in Google Analytics to select only those pages that contain the URL string you're seeking:
By using the simple filter for URLs "containing" /article, I've got a segmented report I can now use to start seeing what's really happening on my site.
Pretty simple, right?
Step 3: Filter on each report and grab out the relevant pageviews number on a weekly basis:
I grab those numbers for each of the segments each week (well, actually, Joanna does - but she says it's less than an hour of work) and plug them into a spreadsheet.
Step 4: Create a spreadsheet and a stacked graph
This spreadsheet shows the number of pageviews to each section of the site
When you run these over long periods of time, you can really see the impact a new section is having, or where problems in traffic might exist. If you neglect to break things out in this fashion, you'll often find that traffic from one section's gain may overshadow the loss in another area. This over/under-compensation can hide the real issues for a site, especially in SEO (where indexation, rankings and keyword demand all play inter-connected roles).
Joanna, in her post on benchmarking, shared this chart:
Also see this larger, detailed version
This helped us to realize where things had gone awry and why (the problem stemmed from some poorly done redirects from Linkscape to Open Site Explorer). I can't recommend this practice enough - if more marketers managed their analytics in this fashion, we'd have a much easier time identifying potential problems, opportunities and understanding not just the quantity of traffic, but the "whys" behind it.
Anyone with some clever Google Analytics methodologies to build these faster/more efficiently than my Excel hack, please do share!
UPDATE: Some friends from Maki Car Rental put together a stacked pageviews PHP code that pulls from the Google Analytics API here. Thanks!
Managing one’s Twitter presence can be tricky. At over 146,000 followers at the time of this writing, I’ve got to do things to manage at scale. I wanted to share with you my thinking, as I’m asked quite often, “How can you follow over 100,000 people?” This post should help you understand the ins and outs of my Twitter Presence.
(Brought to you by the Genesis WordPress Theme – affiliate link.)
My Twitter Presence
First – My Accounts
@chrisbrogan – is my main account. It’s my home base.
@cbreplies – is how I’ll reply to people from now on (most times).
@broganmedia – is a data feed of things I’ve shared plus my blog’s RSS feed.
Every other account that mentions my name in part or in whole is not mine, not run by me, not endorsed by me.
People always ask about the tools I use for Twitter. Here’s the list.
Seesmic Desktop – for my desktop.
Seesmic Mobile – for my phone.
SocialToo – for auto followbacks (NOT auto DMs), and spam mgmt.
Twitter Search – for search strings.
Listorious – for finding good Twitter lists.
oneforty – for all things Twitter (hi, Laura!).
My Twitter Goals
I have more than one goal in mind when using Twitter. In fact, I have several. They may be different than your goals. Your goals are also okay. (Oh, and you’re doing it wrong.)
- I use Twitter to get the pulse of people in the larger online world.
- I use Twitter to communicate in two directions.
- I use Twitter to promote important causes, as well as business opportunities.
- I use Twitter to promote other people's stuff 12 times as much as I do mine (12:1 rule).
- I use Twitter to stay updated on people’s shared news.
- I use Twitter as a quick pulse-taking service.
- I use Twitter to find business (via search).
- I use Twitter to stream links to my stuff and to others’ stuff.
- I use Twitter to connect with humans.
Again, there are lots of ways to use it. Your way is just wonderful. My way is just mine.
My Twitter Methods
- If you’re not using a multi-view client (like Seesmic, or HootSuite, or Tweetdeck), you’re not going to see it all.
- I use search more than any other feature and follow many search strings.
- I scan my @replies and a search on “brogan” to make sure I don’t miss replying to you.
- I have a column open to scan for Trust Agents. I said we would.
- I use lists to make sure I see as many people as I can. I turn these on and off.
- I probably didn’t see your tweet. At over 100,000 people, the Twitter API can’t even send me them all. Software/pipe issue.
- I follow back so that you can DM me. (I unfollow you if you spam me.)
- I follow anyone who follows me (with some exceptions). I unfollow bad/annoying people as I find them.
- I try to reply as often as I can. If I miss your @ message, I’m so sorry.
- Sometimes, I use Twitter to get answers. If you ever want to see what people said back to me, just click this.
- Twitter is not a way to reach me in a hurry. If you need me immediately, you probably already have my cell number.
- Twitter isn’t a great place to ask me serial questions. If you have many questions to be answered, feel free to contact me.
- Twitter is a great place for serendipity. Try that for yourself.
How do YOU use Twitter?
This was my methodology and mindset behind Twitter. Yours is no doubt different. It’d be fun to see your post on “My Twitter Presence.” Feel free to drop links to such posts in the comments section (note that I manually approve all html links, so bear with me). What say you?
Love It: Request files from anyone, even if they don’t have a Dropbox account.
Hate It: Off-site uploading could concern some users. Can only upload 1 file at a time.
Surely you’ve heard of Dropbox, but if you haven’t, here’s the quick and dirty version. Dropbox is a service that gives you a cloud-based storage box for storing, sharing or receiving files. Honestly, it’s probably one of the handiest applications that I have, and I use it daily.
AirDropper is a Dropbox utility that lets you request files from anyone, even if they don’t have a Dropbox account. All you have to do is fill in a form that sends an email to the person who has the file you need, then AirDropper handles the rest.
Your recipient then gets an email that looks like this:
Once they click the link, they’re taken to an upload page. They can upload the file you’re requesting, and then AirDropper will toss it right into a folder that you specified in your request.
When you have an already-great service like Dropbox, it’s hard to go wrong when you’re providing an application that simply makes it easier and more handy. AirDropper does just this, and is going into my short list of must-have web applications.
Original title and link for this post: AirDropper makes Dropbox more useful. Request files from anyone.
By now, if you are not on Twitter, you are quite behind the times. But what if you want to be ahead of the crowd, so far ahead that you actually leave Twitter? After all, the hipster mafia will only stick around something useful for so long before they drop it, so why not beat them to the punch?
While we generally are drop dead enthusiastic about Twitter, and to be fair, we all live and breathe the service around here, are there five good reasons for leaving Twitter? Let’s find out.
No More Social Media Experts
Did you know that there are actual social media consultants who do useful things for corporations, and that indeed, I can actually name some people who are the dreaded 'social media expert'? Yet, on Twitter, everyone is an expert, everyone is an innovative Twitter deity. Except that nearly no one who claims to be one actually is. Everyone that I know who does serious business helping people learn social media is dead quiet about it.
If you have to ask, you can't afford it. If you have to claim expert status, you are no boss. If you left Twitter, these twits would be gone, gone, gone. If you filter Twitter very well on your own, you get to a pretty clean state, but these punks are a great group to dodge, and leaving Twitter is a 100% way to do so.
Meme-Free And Proud
Two years ago, before you were on Twitter, did you give a care what ‘#FF’ could mean? Of course not! Twitter is bringing memes to the masses in a big way, and it is ugly. Head over to Twitter.com and take a look at what is on the trending topics section. Right now there is a wonderful entry called ‘#therewasatimewhen,’ something that is easy to parse at first glance, not always the case with the top trending memes.
What does this mean? That you can take part in a global discussion with a bunch of yahoos (not the company) about some things that you may or may not find interesting, and will not be correctly spelled. Or you could not, and instead could take that time and do something useful with it, like popping that zit you couldn’t get last night. Just a thought.
You Are A Real Person Again
Once you put Twitter out of your life, you will never show up to a bar and then fastidiously tweet its condition, the type of drink you purchased, how it tastes, and what the creepy dude in the corner is doing. Oh, and then you have to answer all the @s you get from that exercise. By that time, your drink is gone, and you are headed out the door, being so social, and yet ignoring anyone who doesn’t have an avatar.
You could have just had a drink, talked to someone new for a minute, had a quiet moment to yourself, and been a real dang person. Let’s face it, we all tweet too much.
Never Know Where Anyone Is
Once you leave Twitter, and I can only suspect, as I could no more leave Twitter than cut off my leg, you will never know where your friends are. Think about it: how often do you really use Foursquare to find friends? More likely, you merely spot the occasional Foursquare tweet from a person you follow and, if you are nearby, invite them out for a pint.
Without Twitter, you get to do your own thing, never worrying about who is where, and if you might be seen going somewhere that is not cool enough for Twitter. Freedom, you have to quit to get it.
Never Feel Obligated To Respond
This doesn’t apply to everyone, as most people are already abysmal at replying to their @s, but once you do cut the Twitter cord, you never have to answer another one again. Even better, you can never be asked if you recall “That exchange we had on Twitter last summer!” Oh hell no you didn’t. I’ve sent out 10,000 tweets since last summer, and half of them are responses. Unless you are Denzel Washington, I did not file our conversation away for later recall.
Well, that is my list. It is hardly complete, and I am sure we could brainstorm something longer, but you have to admit there are some real pluses to leaving the ol’ Twitter. Now if you will excuse me, I need to go see how many @s I received while writing this post.
Original title and link for this post: 5 Reasons You Should Quit Twitter Right Now
A number of medical and financial professions require processing mountains of data, but all of that combined still doesn't come close to the amount of data the Internet has absorbed. Google estimates that the internet holds about 5......
A trick: every time you come across the word ‘Twitter’, replace it with ‘Society of Aquarium Enthusiasts’. According to an estimate by internet company Twirus, which tracks Twitter statistics, there are now 191,000 active Twitter users in the Netherlands. That is fewer people than the Netherlands has aquarium enthusiasts, journalist Arjen van Veelen writes today on the opinion pages of nrc.next.
Van Veelen: “So when a Radio 1 host announces: ‘we’re on Twitter too!’, what you hear is: ‘And we’re also members of the Society of Aquarium Enthusiasts!’ And whenever there’s a kerfuffle among Twitter users, read it as: ‘Uproar and amusement among aquarium enthusiasts!’”
Twitter has been overrated ever since its founding, Van Veelen argues. “If something is news on Twitter, these days it is ‘therefore’ also news in De Telegraaf, de Volkskrant, the NOS Journaal, and so on. This site is given a disproportionate amount of weight. That produces hypes, hoaxes and noise. Someone says ‘boo!’, someone else replies ‘bah!’, and before you know it we’re dealing with parliamentary questions.”
At the heart of the problem lies a set of misconceptions about Twitter, Van Veelen believes. He debunks seven “stubborn Twitter myths”. The short version is below:
1. Twitter is big. Almost 99 percent of the population doesn’t tweet. And the people who do tweet aren’t representative of the population: politicians, marketers and IT workers are overrepresented. Seen that way, Twitter is merely a hip intranet for the in-crowd, a virtual Amsterdam canal-belt clique.
2. Trending Topics are important. ‘#Carglass zuigt’ (‘#Carglass sucks’) was a trending topic on Twitter last week, worldwide. But what does that mean? According to Nico Schoonderwoerd (of Twirus), a topic already counts as the most-discussed (‘trending’) worldwide at two messages per second.
3. Twitter is a blessing for democracy. At first I found it an amusing idea to watch an important parliamentary debate on TV, about Uruzgan for instance, while simultaneously following the messages from @femkehalsema, @borisham and @diederiksamsom on my phone. Until I thought: that’s odd, actually. These people are supposed to be paying close attention, yet here they are chatting with me.
4. Twitter is a blessing for journalism. Suppose nrc.next printed above its news page: ‘Reader, work out for yourself which report is true.’ That would be absurd. Yet a Twitter feed is exactly that kind of news page, only without the disclaimer. A typical Twitter news flash follows this pattern: Something happened!!! Oh, no, it didn’t!!! Or wait… no, never mind… Move along folks, nothing to see here!!! (How often does someone tweet: ‘OMG! Ten police cars are driving into my street as we speak!’ Or: ‘A helicopter has been circling over my head all day!’)
5. Speed is always good. The price you pay for one piece of genuine breaking news per year: a running subscription to noise. And even when the news is accurate, what do you actually gain from knowing three minutes earlier that a plane has crashed at Schiphol?
6. Twitter is self-correcting. Ha. That’s like expecting frogs to jump back into the wheelbarrow on their own. Do you really verify a message thoroughly before retweeting it? And when it turns out to have been a hoax, do you ask everyone who passed it on: ‘Er, sorry, can I please have my tweet back, something was off’?
7. It’s all Twitter’s fault. Those 191,000 users should by all means keep tweeting: Twitter is a fun, fast and occasionally useful website. Harmless, as long as politicians, media and public prosecutors don’t make it bigger than it is.
And now I am curious what members of the Society of Aquarium Enthusiasts think of this article.
Posted by randfish
For a long time, if you asked me about spamming the search engines, whether with hardcore black hat tactics or merely gray hat link acquisition, I'd say that in the long run, neither was the right move. Building a great site and a great brand through hard work, white hat links, solid content and marketing strategies has always been my path of choice. It still is today, but my faith is definitely wavering.
In the last 12 months, I've seen (or, at least, felt) less progress from Google's webspam team than in any previous year I can remember. Popular paid link services that Google's search quality folks are clearly aware of have worked for months on end (some have done so for years). Crummy, low quality directories and link exchanges have made a comeback since the big shutdowns in 2007-8. Even off-topic link exchanges, which experienced their own blowback in 2006-2007 have started working again. Horrifyingly bad sites are ranking atop the results using little more than exact match domain names and a few poor quality links. There's even a return of the link farms of the early 2000s, with operators creating (or buying old domains and converting them into) junky, one-page sites to boost their own link popularity.
On nearly every commercially lucrative search result I pull up these days, I see bad links pushing bad sites into the top rankings at Google.
Examples of Web Spam in the Rankings
I made a promise to Aaron that I wouldn't "out" spam, and although I still don't believe it's the wrong thing to do morally (it hurts everyone's search/web experience, why should SEOs band together to protect it?), I do want to keep that promise. So, while I can't point you to any particular links or sites, here's a good set of queries where plenty of link manipulation is keeping a few, some or many of the top (5-10) ranking sites in those positions:
- SEO Software
- Starcraft 2 Strategies
- Birthday Party Supplies
- Currency Trading Online
- Tennis Racquet Reviews
- Leather Crafting Supplies
- Nanny Services
- Home Business Ideas
- French Doors
- Vietnam Tours
- Antioxidant Supplements
- Home Espresso Machine Ratings
Just run a few OSE reports on some sites that rank well here and you'll see what I mean. There are numerous players in these listings who don't have a single natural or editorially endorsed link. And you don't need to limit yourself to these queries either.
3 Steps to Find Lots of Link Manipulation
Step #1: Search for "SEO Friendly Directory" and visit a few of the sections included in the resulting sites that come up.
Step #2: Search for the primary keywords the directory-listed sites are targeting in their title tags or the anchor text they've gotten from the directories.
Step #3: Check out the top 5-10 listings in the rankings and you'll find an abundance of sites with few to no "natural" links whatsoever.
Why is Google Letting So Much Spam/Manipulation Go Unpenalized?
I don't know. But, I do have some guesses:
- Scalability of Spam Fighting Tactics - it could be that the ability of Google's team to combat web spam has diminished due to the increasing size, complexity and demand in search. Perhaps fighting spam is a much tougher problem at hundreds of billions of pages than it was at tens of billions.
- They're Working on Something Big - for many years, Google would let lots of spam they clearly knew about pass... for a while. Then, they'd release an algorithmic update to defeat a huge layer of spam or seriously cripple certain types of link manipulation. If that's the case today, this would be one of the longest times between updates we've seen (MayDay had a small impact, but it wasn't link-manipulation targeted from everything I've seen).
- Too Much Baby Thrown Out with the Bathwater - perhaps, as link manipulation and spam have grown in popularity, Google's found that they can't penalize a technique or sites employing it without dramatically reducing the usefulness of their index (because so many "good," "relevant" sites/pages do some dirty stuff, too). If this is the case, they'll need to work on much more subtle, targeted detection and elimination systems, and these might be substantially harder to employ.
- WebSpam Team Brain Drain - The spam fighting team put together by Matt Cutts from 2001-2006 was Google's cream of the crop. He personally hand-selected engineers from search quality (and other departments) to combat the black hat menaces of Google's early growth days. SEOs could frequently interact with many of these crazy smart folks, from Brian White to Aaron D'Souza to Evan Roseman and many more. That interaction today is largely limited to the webmaster tools team, which may be an appropriate PR move, but it's hard to know whether the new team is up to the task. We do have one new, semi-publicly contributing webspam team member, Moultano, on Hacker News (you can see all the threads he/she has participated in on the spam topic with this query).
Matt himself is finally taking a well deserved break, but even at home he's much less public on the web, much less active on webspam topics on his blog, visits fewer conferences and now invests in startups, too (which surely takes up time). I don't mean to criticize Matt in any way - if I were him, I'd have left Google long ago (and he's clearly put in more than his dues), but the possibility remains that the team he built is no longer intact, or no longer of the quality it was in the early years.
- Live and Let Live - It could be that although Google's public messaging about webspam and link manipulation hasn't changed, internally their attitude has. Perhaps they've found that sites/pages that buy links or run low quality link farms aren't much worse than those who don't and having relevant results, even if they've used black/gray hat tactics, isn't highly detrimental to search quality. Certainly in some of the examples above, that's the case, while in others it's less true. I recall that years ago, the MSN Search team noted that they'd much rather fight poor quality results in the index than fight high quality results who happened to buy links. Maybe Google's come around to the same philosophy.
- They're Counting on New Inputs to Help - Part of Google's initiative in acquiring social gaming companies, building social platforms and making data deals with folks like Twitter could be to help combat spam. They may have hopes that leveraging these new, less polluted (or, at least, more easily trackable) forms of recommendation/citation can be a big win for webspam and search quality.
Why Rant About Spam?
"Blah. Blah Blah. So what if Google's not doing as much to stop spam as they have in years past?" I hear you ask.
My concern is primarily around the experience of searchers and what it might mean if results become polluted not just by good or relatively good sites that happen to buy or manipulate links, but by really bad crap - the sort that makes searchers want to find a new way of getting information on the web (Facebook Q+A? Twitter? Yelp?). Search today is an amazing marketplace of web builders, marketers, suppliers and customers. If the last of these - the customer - slowly becomes disenchanted with Google, the world of search marketing and the amazing utility of search in general may come to an end.
If you use search engines or work in search marketing, that should be the last thing you want.
That said, if you believe that most of the "spam" will eventually be beaten out by either legitimate results or by better sites that also spam/manipulate links, then there's much less to worry about (I'm not fully in either camp and can see both sides).
So, What Should Legitimate Marketers Do?
Please DO NOT go out and spam the results, buy links, submit to crap directories and open up link farms. Even with this current trend, I believe that would be terrible advice. Plenty of sites do get caught and filtered, and I'd rather know that my site was safe and every piece of content I added and link I built would help bring more traffic than constantly worry about the small but real risk of being penalized or banned.
One thing Google has done is continue to make the experience of penalization a horrific one. It's hard to know if you really have a penalty, nearly impossible to figure out what triggered it and onerous, almost Kafka-esque, to attempt to get back into their good graces. If you can live with that risk, as professional black hats do with their churn-and-burn strategies, then it's less of a concern. But if you're building a real business, Google is still driving 70%+ of the searches on the web in the US (and 90%+ in many other geographies), and it would be foolish to take such a terrific risk.
As to the question of reporting the spam of your competitors - that's up to you. However, Google has certainly made it a less likely, less rewarding activity. Nearly every day, we answer PRO Q+A related to the question of link manipulators outranking legitimate marketers and sites, and I can recall only once in the hundreds of questions I've answered in the last few years when a spam report actually led to action (to be fair, I don't follow up consistently on every one, but many of our PRO members will send a regular ping with updates).
What we can do is to re-double our efforts to build great sites with amazing value for people. No matter what the "search" experience of the future is like, those sites and pages that provide a remarkable experience are sure to surface near the top and receive the added benefit of word-of-mouth praise, viral spread and citation in whatever forms it may evolve to, both online and off.
Some Caveats to My Experience
There are millions of queries that are remarkably spam-free, and Google has done a consistently exceptional job fighting spam over the years. However, the recent past has me concerned that they are no longer as interested, diligent or capable of combatting even the most basic spam techniques.
It's also certainly the case that I'm regularly exposed to many queries and topics that SEOs, both black hat and white, focus on, and thus might see more spam than the average searcher (though anecdotally I'd guess they're seeing more, too).
What Do You Think?
Have you been seeing more results in the rankings that are performing well despite having virtually no "natural" links? Have you seen Google take action on spam reports? Why do you think the recent past has many fewer examples of big spam-cleaning updates?
I'm looking forward to some great discussion - and this week I'll be at SES San Francisco (on 5 different panels!) - feel free to grab me and chat privately there, too!
p.s. With regards to Bing, the only other major US search engine now that they're powering Yahoo! (or on the verge), my opinion is that they have been making substantive strides. They're still behind Google in many areas (and ahead in a few), but at the current rate, we might actually see Bing surpass Google's spam detection and filtering in the next 18-24 months, though they will probably still be playing catch up in long tail relevancy/quality.
Vimeo is releasing a “universal player” today that allows users to watch embedded Vimeo videos on mobile devices, including the iPhone and iPad, using the video playback capability built into the new HTML5 standard.
Vimeo will deliver the optimal player (Flash, HTML5 or native) based on a user’s browser, as well as the appropriate video definition (HD, SD, mobile) and compression standard (H.264 or WebM, an open format developed for use with HTML5).
Vimeo has been serving HTML5 video for iOS devices for a while now, when you watch them at vimeo.com. What’s new is that they’re now doing this for Vimeo videos embedded on other sites as well. Update: Looks like YouTube is testing something similar.
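Vimeo hasn’t published its detection code, but the decision logic described above can be sketched roughly like this. This is a hypothetical Python illustration; the browser lists and the function name are assumptions of this sketch, not Vimeo’s actual rules:

```python
def choose_player(user_agent: str) -> str:
    """Pick an embed player type from a browser User-Agent string.

    Illustrative only: real-world detection also uses client-side
    feature tests (e.g. probing HTML5 <video> support), not just
    User-Agent sniffing.
    """
    ua = user_agent.lower()
    # iOS devices cannot run Flash, so they must receive HTML5 video.
    if any(device in ua for device in ("iphone", "ipad", "ipod")):
        return "html5"
    # Browsers assumed (in this sketch) to support HTML5 video playback.
    if "chrome" in ua or "safari" in ua:
        return "html5"
    # Everything else falls back to the Flash player.
    return "flash"
```

The same branching would extend naturally to picking a definition (HD, SD, mobile) and a codec (H.264 or WebM) once the player type is known.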
Google is making changes that require agencies to be more "transparent" in their reports to advertisers. But will this change benefit advertisers—or is it a move by Google to build its own brand and assure customer loyalty? The answers aren't entirely... "transparent."
But there is still one big downside: third-party apps. While the iPhone boasts 225,000 of these downloadable programs, and Android claims 70,000, the BlackBerry platform is still stuck at a measly 9,000.
I know space is limited in newspaper reviews, but this “how many apps are in the respective app stores” metric is being given too much weight — not just by Mossberg, either. I’ve said this before, but by this metric, we’d all be using Windows, not the Mac. Which platform has the most apps is interesting, but which platform has the best apps is more important. I say the answer to both questions is iOS, but what if Android gets to 300,000 or 400,000 apps or whatever before iOS does? Would that make Android better?
Put another way: is it a bigger problem that RIM’s App World has only 9,000 apps, or, that the typical quality and polish of their apps is beneath that of the apps in Apple’s App Store? A simple app count is nice and comfortable because it’s not subjective (like my statement in the previous sentence about quality and polish), but it’s potentially misleading.
From what I’ve been reading on support forums, some users of Quicksilver saw no effect from upgrading their machines to Snow Leopard. I, however, was not one of those people. And although I am warming more and more to Google Quick Search Box, I still supplement my usage of QSB with Quicksilver where the former is lacking in features. So I tinkered around until I was able to resuscitate and use Quicksilver again under Snow Leopard.
Been having similar problems? Let me walk you through the process.
First, make sure you’ve upgraded to the most current version of Quicksilver. B56a7 is the most up-to-date version built for Snow Leopard. Download it here if you haven’t already done so. Once that’s done, go ahead and try running Quicksilver. If you’re lucky, that’s all that is needed. If you’re not, don’t fret, because we’ll figure it out in short order.
What I found to be the issue with my installation of Quicksilver was the huge collection of plug-ins I’ve installed over the years. Plug-ins (in my opinion) are what have made Quicksilver what it is to so many of us today. They extend it into nearly every nook and cranny of your Mac, allowing you to control everything. The downside is that some of those plug-ins are broken under OS 10.6’s new underlying architecture.
So let’s fix that.
- Navigate (using Finder) to ~/Library/Application Support/Quicksilver
- Open the ‘PlugIns’ folder
- Select all of the plug-in files, and drag them someplace else (I suggest the Desktop, for easy access, as we’ll be using them again)
Now, let’s make sure Quicksilver opens before we go any further. If at this point the answer is no, I’d recommend posting the question to the Blacktree Quicksilver group on Google. (I’m happy to help too of course, but I’m not that bright, and if I can’t duplicate your setup, there’s a slim chance I’ll be useful on the subject.) If on the other hand, Quicksilver is starting up, let’s continue.
- Quit Quicksilver
- Drag the first plug-in (that you moved somewhere, like your Desktop) back into that ~/Library/Application Support/Quicksilver/PlugIns folder
- Launch Quicksilver
Did Quicksilver remain open? Or did it crash after a few moments? If Quicksilver is still running, then that plug-in was not the culprit, and you may return to the ‘Quit Quicksilver’ step and repeat with the next plug-in file. If Quicksilver bombed after launch, continue on below. (Does this feel like a choose-your-own-adventure to anyone else?)
- In the ~/Library/Application Support/Quicksilver/PlugIns folder, delete the plug-in that you just placed there
Unfortunately, that plug-in was no longer valid under the new Snow Leopard architecture. At this point, you should rinse and repeat (so to speak), moving the next plug-in file from your Desktop, into the Quicksilver PlugIns folder, and then seeing if Quicksilver crashes or not. Do this until you’ve deleted the bad plug-ins, and have the rest installed, with Quicksilver running happily again.
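If you have many plug-ins, this quit-move-launch loop gets tedious, and the procedure above can be sketched as a small Python script. To be clear, this is my own hypothetical sketch, not an official Quicksilver tool; `quicksilver_stays_up` is a placeholder you would have to supply yourself (for example, relaunching the app with `subprocess` and checking `pgrep` a few seconds later):

```python
import shutil
from pathlib import Path

# Paths from the walkthrough above; adjust for your own setup.
PLUGINS_DIR = Path.home() / "Library/Application Support/Quicksilver/PlugIns"
STAGING_DIR = Path.home() / "Desktop"  # wherever you dragged the plug-ins

def triage_plugins(staging, plugins_dir, quicksilver_stays_up):
    """Move plug-ins back one at a time, deleting any that crash Quicksilver.

    `quicksilver_stays_up` is a function you must supply: it should quit
    and relaunch Quicksilver, wait a few seconds, and return True if the
    app is still running. It is injected as a parameter so the triage
    logic itself can be exercised without touching a real install.
    """
    bad = []
    for plugin in sorted(Path(staging).iterdir()):
        target = Path(plugins_dir) / plugin.name
        shutil.move(str(plugin), str(target))  # reinstall this plug-in
        if quicksilver_stays_up():
            continue                           # plug-in is fine; keep it
        target.unlink()                        # crashed: delete the plug-in
        bad.append(plugin.name)
    return bad
```

Calling `triage_plugins(STAGING_DIR, PLUGINS_DIR, my_launcher)` would then leave the good plug-ins installed and report the names of the bad ones.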
As a point of reference, the plug-ins that appear to cause me trouble were:
- Airport Module
- Services Menu
- Text Manipulations (not 100 percent certain about this one)
- Image Manipulations (not 100 percent certain about this one)
- Any interface modules seem to be causing issues as well
The downside is that you may lose some functionality from Quicksilver (if you were even using that particular plug-in). The upside is that you’ll still be able to run Quicksilver, if somewhat pared down in capability. In my case, I didn’t lose anything I’d been using regularly. In the event that you have actually lost something of use to you, you can either downgrade back to OS 10.5.8 (probably not realistic), or wait on Google QSB to get up to speed, as I have seen some great strides on that application in a short period of time.
I was recently asked in a tweet, “Have you completely abandoned Windows now?” I realized, with some genuine surprise, that not only had I stopped using Windows in any meaningful way, but I had actually stopped using it the moment I got my hands on my first (modern) Mac in 2008. So, in the aftermath of last month’s much-anticipated release of Snow Leopard, I find myself thinking about the move I made between operating systems, and my experiments since then with Microsoft’s latest offering.
First, a little background. I flirted with Macs in high-school Graphic Art lessons and then again, very very briefly, in the late 90s when a colleague handed me a PowerBook and said “Here, you could use that if it’s any good, but I don’t know if it even works…” It did work, but to be honest, it really wasn’t any use to me at all. Anyway, even if I had wanted to use the Mac, everyone I knew was working on a Windows PC of some flavor or other, and though the PowerBook had a working copy of Microsoft Office (such as it was in those days) incompatibilities were an ever-present problem.
Here’s an example of a conversation I had, many times, with the one person I knew who used a Mac;
Gloria: Liam, that file you sent me…
Liam: Yes, the Word document.
Gloria: It doesn’t work properly. I’ve lost all the formatting.
Liam: What do you mean, you’ve ‘lost’–
Gloria: It’s a mess.
Liam: Did you open it using Word?
Gloria: Of course I did! Look, will you just paste the text into an email for me, yeah?
I used to think the problem lay not with her computer, but more with her inability to use it. I later saw for myself, however, that she was absolutely right. Word documents created on a Windows PC didn’t fare well in Word on her Mac. Crazy.
The short of it is that I used Windows for everything and I had no compelling reason to want to switch. At home and at work, even on the road with my Windows Mobile devices through the years, I was 100 percent a Microsoft customer. Throughout those years, every experience I had with the Mac was a bad experience. And it was usually, as in the example above, related to the same tedious issue — incompatibility.
By 2005 I’d certainly heard about Mac OS X, though the closest I came to it was reading Paul Thurrott’s reviews and opinions on his SuperSite for Windows. He spoke of a decent OS, but reassured me that I was missing nothing. Then Apple released the iPhone and, despite my aversion to all things Apple, the Geek in me couldn’t resist and I bought one.
The experience on the iPhone was simply amazing, far better than any I’d had on any other device in… well, forever. It made me question my assumptions about the Mac. So in the summer of 2008, I wandered into the Apple Store on London’s Regent Street and spent a half hour pratting-about on different machines. I left with a MacBook. And on that very day, Windows died for me. Leopard was a breath of fresh air.
But let me be clear; I didn’t switch because I felt the Mac was a superior platform. Honestly, I feel that, for the majority of people, it’s no better or worse than Windows at the mechanics of making email, word processing and web surfing possible. I switched because it offered a far superior experience in doing those everyday things. When I think about Windows and where it fails for me, it always comes down to that same issue; experience.
Despite the “XP” in its 2001 OS name, it was only with Windows Vista that Microsoft finally seemed to “get” that user experience matters. Yet, beyond Vista’s eye candy there’s not a lot in the way of a unified, cohesive and organic experience that makes me want to use it as my everyday computing environment. This isn’t blind fanboy-ism talking; I used Vista since its Longhorn days right up until last summer, so I know I gave it a long-enough evaluation!
The user experience in Windows 7, too, has not changed since Vista, save perhaps for the addition of some fiddly new UI gimmicks (Aero Peek, anyone?). To me, 7 ‘feels’ just like Vista did. I keep moving around the OS hoping to have an epiphany (“Aha! There’s the cohesive, rewarding experience I was searching for!”), but it just doesn’t happen.
I want to like Windows 7, but after trying various beta builds for the last year and repeatedly doing my best to enjoy it, I found myself feeling relieved whenever I returned to the elegant lines of Mac OS X.
I don’t hate Windows 7. I don’t think it’s shoddy, unattractive or fundamentally flawed. But just as Thurrott would say of Snow Leopard, when it comes to Windows 7 there’s just not much there. Windows 7 is a perfectly capable operating system that looks nice and gets the job done. Ultimately, however, it’s just not very interesting and, for recent switchers to the Mac, it’s too little, too late.