Archive for April, 2009
After quitting my job, I decided to start smoking a pipe. No, I didn't start smoking the wacky tabaccy. I quit smoking cigarettes on September 13th, 2008, and I'm done with them for good. I don't want to smoke cigarettes, but I do want to do something like it. I think it has to do with keeping your hands occupied or relieving stress, because smoking a pipe does both, and I feel a lot better since I took it up.
Now, I don't inhale the smoke. I just "puff" on the pipe. I know I still get some of the nicotine from the smoke but at least I'm not getting the harmful effects on my lungs. I'm sure there is some danger from the pipe smoke but the stress relief outweighs the risks completely.
With all that in mind, I have a great suggestion for anyone who is reading my little blog. Cade's Cove Cavendish is absolutely the best pipe tobacco you can get. The only place to get it is the Gatlinburlier, a tobacco shop in Gatlinburg, TN. If you are a pipe smoker (the tobacco sort), you HAVE to try it. You can get it online at http://gatlinburlier.com. I don't make a dime from that link, so don't think this is some kind of scam. That's honestly THE best pipe tobacco I've ever tried. If you have a suggestion for the best pipe tobacco, or think that your pipe tobacco is better, I'd love to try it and give a review. Until that time, "Cade's Cove Cavendish" is the best.
Once you have a following on Twitter, it's easy to gather a little extra traffic to your site from it. To help automate the process, I make use of Twitter's API and the Linux command line, creating cron jobs that update my status using curl. This is pretty simple to do and may be helpful for people with a Linux box and the need to advertise something.
The curl command is structured as follows:
curl -u username:password -d status="My new status message" http://twitter.com/statuses/update.xml
Note that for the automated cron jobs, the returned XML isn't really needed. You can also use this call in your programming language of choice to fetch the XML and make use of it. To get JSON results instead, change the ending from .xml to .json.
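If you do want to use the returned data, any language can pick the fields out. Here's a minimal JavaScript sketch using the .json variant. The sample below is a trimmed, hypothetical response; the real one carried many more fields, including the full user object:

```javascript
// Trimmed, hypothetical example of the .json response shape.
var sample = '{"id": 123456789, "text": "My new status message"}';

// Parse the JSON string and pull out the fields we care about.
var status = JSON.parse(sample);
console.log(status.id);   // 123456789
console.log(status.text); // My new status message
```

In a real script you'd feed curl's output into the parser instead of a hard-coded string.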
So, once you have the code, all you have to do is create the cron jobs in Linux. Edit the crontab with:
crontab -e
Your default editor should open your crontab. Here is an example showing how to create the cron job:
5 * * * * curl -u username:password -d status="My Message" http://twitter.com/statuses/update.xml
That cron job would run at 5 minutes past the hour, every hour of every day. This is, however, not a good idea, because an account posting the same message every hour will not last long 🙂
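For a gentler schedule, spread the updates out. Here's a sketch of a crontab with a couple of safer entries (the messages and credentials are just placeholders):

```
# min hour day-of-month month day-of-week  command
0 9 * * *    curl -u username:password -d status="Good morning, Twitter" http://twitter.com/statuses/update.xml
30 12 * * 1  curl -u username:password -d status="New blog post up this week" http://twitter.com/statuses/update.xml
```

The first entry posts once a day at 9:00; the second posts only on Mondays at 12:30.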
I'm not a very good programmer. I think the biggest reason for that is that I haven't had enough practice. I've written plenty of apps and web sites, but most were very simple. My latest app, whats-hot-weekly.com, is actually a simplified version of another app I wrote that lives at givemeaniche.com. The two apps basically do the same thing; the main difference is that givemeaniche also shows the most searched-for terms in addition to the most watched items.
The hardest part of it all is coming up with new ideas for serious work projects. I have a few but being a solo developer, designer, etc means that I'll have to put some time into them. Any ideas for apps and websites would be much appreciated.
I first tried Twitter about a month ago. I wasn't very impressed at the time. I have this theory that we are actually going backward in usefulness on the web. Back in the 90's we had web apps that did a lot more than Twitter, yet Twitter is supposed to be sooooo amazing.
Well, it is. Here's the thing. As the internet gets more and more crowded, certain things become more and more popular. Marketing is getting extremely popular on the internet. That's not to say that marketing hasn't been popular on the internet for many years; it's just getting exponentially more so. Twitter is a marketer's dream. Facebook is another that works great for marketers.
You see, these apps are just less robust versions of old forum software. They are also meant to grab a larger audience, whereas forums are meant for a smaller niche. This broad audience and simplistic design make these apps extremely useful for advertising. Unfortunately, the normal user sees less functionality from these apps than from software of 15 years ago. Those users won't understand this, however, because they have never been exposed to that older technology. They just know that all of their family is on Facebook or MySpace. So they join the sites and start getting bombarded with new advertisements.
If you are smart, you'll jump on the marketing bandwagon soon. There's a lot of money in it.
A few weeks ago I posted a couple of times about the xlack system info script for xchat. It was very useful considering I couldn't find the script anywhere on Google. Right after posting the download for the script, I had a decent rank on Google for the keyword "xlack download" and even "xlack". This was to be expected since there really wasn't a lot of information about Xlack available on the internet anymore.
Now I'm not even in the first page of results. In fact, the number one position for "xlack" is a placeholder page for the old xlack.tk, which was the original website for the xlack script. There's also a high placement for a DeviantArt member's page. At any rate, anyone actually using Google to find the xlack script would be hard pressed to find it. Hopefully this site will rank for the xlack keyword soon so people can actually find the script.
I was just using my new tool at whats-hot-weekly.com and I noticed that the most watched item in Computers & Networking has changed a bit. The latest, greatest selling item is a web camera. The price simply can't be beat: it's $1.00. It has a built-in mic and would be great for Skype and such. I'm not sure how many they have left, but according to whats-hot-weekly.com they've sold over 8,000 of them.
Finally, here is the link to the web camera in question. Notice that it's only a 300K (0.3 megapixel) camera, but that's VGA quality, which should be fine. I mean...come on...it's a buck.
Ok, I have yet another reason to loathe IE7. I hit an error on whats-hot-weekly.com earlier, with this message coming from my AJAX XML object:
responsexml.documentElement is null or not an object
Everything worked in Firefox, Chrome, and even my Nokia N810's browser, but IE7 just wasn't going to cooperate. I knew that it was working last night, but I made a few changes before going to bed.
Two of the changes I made were related to the meta tags in the HTML of the main index page. I added keywords and description meta tags for search engine optimization. Little did I know that one of these was the culprit.
After much Googling, I came upon this article, and in it I immediately found the reason for the error. It should have been a little obvious, but I had overlooked something simple. My search box for entering keywords had an id (and name) of "keywords". The new meta tag for keywords also used the name="keywords" attribute.
In the article, it was the "description" meta tag that had caused the conflict. Upon reading that, I realized my mistake. This affected IE only, because when you call document.getElementById, IE also matches elements by their name attribute. It was grabbing the meta element instead of the search box.
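To make the collision concrete, here is a minimal sketch of the two elements involved (the content value is made up):

```html
<!-- The keywords meta tag added for SEO -->
<meta name="keywords" content="ebay, most watched, auctions">

<!-- The search box, elsewhere on the same page -->
<input type="text" id="keywords" name="keywords">

<script type="text/javascript">
// In IE7, getElementById also matches the name attribute, so this
// can return the meta tag above instead of the input element.
var box = document.getElementById("keywords");
</script>
```

Renaming either the search box's id or the meta tag removes the ambiguity.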
I hope this helps someone else, in the event that all the variables fall into place for this to happen again.
Well, after a few days of serious development, I'm releasing the beta of my site. It's called whats-hot-weekly.com. It is basically a "most watched items on ebay" site. With it users can find what everyone else is looking at.
One can search by keyword/keyphrase, by category, or by both. The interface may need a little polishing, but it's functional. There is at least one minor bug that I'm working on. The site will continue to improve, and I'll be adding forums soon, so that people can report bugs and discuss what they find.
Don't forget to let me know what you think by commenting here or by emailing me at the address listed on the site's home page.
Any of you who are AJAX developers can skip this post. I've just recently started concentrating on AJAX for a project I'm working on. In the past, I've used PHP to parse XML returned from various web services, and it has always gone off without a hitch. I also wrote my own web service in PHP for this project.
The project is basically a site that works with the eBay API to pull the most watched items for any keyword search phrase and by category. I've pulled all the categories into a local database, which lets me avoid an API call every time someone clicks through different categories. All the categories are in an AJAX menu system I wrote. The menu is completely dynamic and loads the categories and subcategories using calls to my web service. This part was fun because it gave me the opportunity to create a web service, and that in itself was worth the time I've spent on the whole project.
I ran into a roadblock, however, because I wasn't aware that AJAX doesn't allow cross-domain calls; browsers block them under the same-origin policy. I started getting error 1012, "Access to restricted URI denied". This means that the actual calls I would like to make to the eBay API won't work through AJAX. I wrote the code to try anyway and kept getting this error. That's when I found out that it isn't possible with AJAX alone. It's fine in PHP, however. So, here is the work-around I'm brainstorming. I know it'll work; it's just a matter of doing it.
The work-around is simply to write another web service in PHP that makes the calls for me, then use AJAX to pull the info dynamically from the local host. The upside is that I can also log various stats about the calls within the web service. I could create a table in my database to log the searches, time of day, IP address of the user, and so on. This will allow me to understand how the app is being used, who's using it, when they are using it, and what they are using it for.
This is also known as using AJAX through a proxy, where the proxy is a web service on the local host that the page can legally make AJAX calls to.
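The client side of that proxy setup might look something like this. The proxy path (/ebay_proxy.php) and the parameter name are assumptions for illustration; the PHP script at that path would make the real eBay API call server-side:

```javascript
// Build a same-origin URL for the proxy, so the browser's
// cross-domain restriction no longer applies.
function proxyUrl(keywords) {
  return "/ebay_proxy.php?keywords=" + encodeURIComponent(keywords);
}

// Fetch the proxied eBay results and hand the parsed XML to a callback.
function fetchMostWatched(keywords, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", proxyUrl(keywords), true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      callback(xhr.responseXML);
    }
  };
  xhr.send(null);
}

console.log(proxyUrl("web camera")); // /ebay_proxy.php?keywords=web%20camera
```

Because the URL points back at the same host, the error 1012 never fires, and the PHP side is free to log each search before forwarding it to eBay.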
So, I'm off to code another web service.
Even though there are many ways to write "Blu-Ray," they all mean one thing: high-definition video and great audio. Mix that with the best cinematography ever thrown into a documentary and you end up with the Planet Earth Blu-Ray edition. To me, this is what Blu-Ray was invented for.
The Discovery Channel went all out with this documentary. It is an eleven-part miniseries that explores the mountains, oceans, deserts, jungles, polar regions, and fresh waters of our favorite planet. In the process, they display the best visuals you'll ever see in a documentary. I saw this Blu-Ray set in Walmart a week or so ago for around $70. The retail list price is $100. I found it for $50 here. Not even Amazon has this good of a deal.