Archive for September, 2009

Wikipedia: A credible source?

Most people who have attended college in the last few years know that colleges don’t like Wikipedia cited as a source in papers. Their reasoning is sound: Wikipedia is, after all, community-based, and anyone can make changes to a wiki article.

However, I tend to agree with the open source way of thinking. More eyes on the issue make it less likely that someone can take advantage of it. Let me explain.

Security is actually a big selling point of open source software. Anyone can look at the source code, which means anyone can see where the software could be exploited. At first, that sounds bad for security. All those prying eyes will find all kinds of ways to exploit the software, right? Actually, the answer is no. Those prying eyes help keep everyone in check. If you contribute to a piece of open source software and you add some malicious code, someone will spot it, correct it, and report you.

The same concept applies to Wikipedia, in my book. If an article makes a false statement, someone will usually find it and correct it. With so many people able to edit each article, it becomes increasingly difficult for an article to stay biased.

Universities may not accept Wikipedia as a credible source, but it’s still an excellent choice for finding information. In fact, it’s probably more accurate than many of the citations the schools will accept.

I mean, think about it. They will accept a citation from CNN.com or FoxNews.com but won’t accept a citation from Wikipedia? If anything, CNN and Fox are much more biased. So, while I understand that Wikipedia isn’t a rock-solid source, neither are the other websites that are accepted. In fact, most sources are biased. Unless a source states only facts, there is probably an opinion mixed in there somewhere. That’s what journalists do. That’s what scientists do.

What is the difference between Wikipedia and Encarta? One is maintained by millions of people who review each other’s work. The other is a proprietary, closed medium whose contents only a few people can review. I prefer the wiki.



Warning: simplexml_load_file() [function.simplexml-load-file]: URL file-access is disabled in the server configuration

If you’ve seen that error message, you’ve probably run into a security feature that your shared web hosting provider has enabled. There are a few work-arounds for this error, but most require privileges on the server that you probably don’t have. Quite frankly, if you are getting this error, you almost certainly can’t change these settings yourself.

Rather than try to get the provider to change these settings (let’s face it, they enabled this for a reason, and surely someone else has already tried to get it changed, right?), you can easily get around it with cURL. In most cases, cURL will be enabled on the server. So here is the quick and dirty way to work around it:

Create a PHP file and name it anything you want. For the sake of this article we’ll refer to it as curl_functions.php. In this file put the following functions:

<?php
// Create a single shared cURL handle, configured to return the
// response as a string instead of printing it directly.
function setupMyCurl() {
   $myCurl = curl_init();
   curl_setopt($myCurl, CURLOPT_RETURNTRANSFER, 1);
   return $myCurl;
}
define("myCurl", setupMyCurl());

// Fetch the contents of $url and return it as a string.
function curl_get_contents($url) {
   curl_setopt(myCurl, CURLOPT_URL, $url);
   return curl_exec(myCurl);
}
?>

Include or require this file. Then all you have to do is call curl_get_contents($url) in your code to pull the XML into a string, and use simplexml_load_string() instead of simplexml_load_file(). This gives you the same result but works around the disabled URL file-access (allow_url_fopen) setting. If you don’t have cURL enabled on your host, GET ANOTHER HOST. 🙂
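As a quick sketch, fetching and parsing an RSS feed with this helper might look like the following (the feed URL here is just a placeholder, and curl_functions.php is the file created above):

```php
<?php
require_once 'curl_functions.php';

// Placeholder feed URL -- substitute your own.
$url = 'http://example.com/feed.xml';

// Instead of: $xml = simplexml_load_file($url);
// fetch the document with cURL, then parse the string:
$xml = simplexml_load_string(curl_get_contents($url));

// Walk the feed just as you would with simplexml_load_file().
foreach ($xml->channel->item as $item) {
    echo $item->title, "\n";
}
?>
```

The SimpleXML object you get back behaves exactly the same either way; only the step that fetches the bytes over HTTP has changed.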

