
Web services are taking over the whole world. I credit Twitter's epic rise to the availability of a simple but rich API. Why not use the same model for your own sites? Here's how to create a basic web service that provides an XML or JSON response using some PHP and MySQL.


/* require the user as the parameter */
if(isset($_GET['user']) && intval($_GET['user'])) {

 /* soak in the passed variable or set our own */
 $number_of_posts = isset($_GET['num']) ? intval($_GET['num']) : 10; //10 is the default
 $format = isset($_GET['format']) && strtolower($_GET['format']) == 'json' ? 'json' : 'xml'; //xml is the default
 $user_id = intval($_GET['user']); //no default

 /* connect to the db */
 $link = mysql_connect('localhost','username','password') or die('Cannot connect to the DB');
 mysql_select_db('db_name',$link) or die('Cannot select the DB');

 /* grab the posts from the db */
 $query = "SELECT post_title, guid FROM wp_posts WHERE post_author = $user_id AND post_status = 'publish' ORDER BY ID DESC LIMIT $number_of_posts";
 $result = mysql_query($query,$link) or die('Errant query:  '.$query);

 /* create one master array of the records */
 $posts = array();
 if(mysql_num_rows($result)) {
  while($post = mysql_fetch_assoc($result)) {
   $posts[] = array('post'=>$post);
  }
 }

 /* output in necessary format */
 if($format == 'json') {
  header('Content-type: application/json');
  echo json_encode(array('posts'=>$posts));
 }
 else {
  header('Content-type: text/xml');
  echo '<posts>';
  foreach($posts as $index => $post) {
   if(is_array($post)) {
    foreach($post as $key => $value) {
     echo '<',$key,'>';
     if(is_array($value)) {
      foreach($value as $tag => $val) {
       echo '<',$tag,'>',htmlentities($val),'</',$tag,'>';
      }
     }
     echo '</',$key,'>';
    }
   }
  }
  echo '</posts>';
 }

 /* disconnect from the db */
 @mysql_close($link);
}
With any luck plenty of people will be hitting your web service, so you'll need to validate the incoming parameters thoroughly before querying the database in order to avoid injection attacks. Once we get the desired results from the database, we cycle through them to build the master array of records. Depending upon the response type desired, we output the proper header and the content in the requested format.
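On modern PHP, where the old mysql_* extension is no longer available, the same lookup can be done with PDO and a prepared statement. The following is only a minimal sketch under the same assumptions as the code above (a local MySQL server, the db_name database, and the wp_posts table); adjust the DSN and credentials to your own setup.

 /* minimal PDO sketch: same query, with the user id passed as a bound parameter */
 $pdo = new PDO('mysql:host=localhost;dbname=db_name;charset=utf8', 'username', 'password');
 $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

 /* $number_of_posts was already forced to an integer above, so it is safe to inline */
 $sql = "SELECT post_title, guid FROM wp_posts WHERE post_author = :author AND post_status = 'publish' ORDER BY ID DESC LIMIT " . (int) $number_of_posts;
 $stmt = $pdo->prepare($sql);
 $stmt->execute(array(':author' => $user_id));

 /* build the same master array of records */
 $posts = array();
 while($post = $stmt->fetch(PDO::FETCH_ASSOC)) {
  $posts[] = array('post' => $post);
 }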
Take the following sample URL for example:
http://mydomain.com/web-service.php?user=2&num=10
Now, we can take a look at the possible results of the URL.

The XML Output

<posts>
 <post>
  <post_title>SSLmatic SSL Certificate Giveaway Winners</post_title>
  <guid>https://davidwalsh.name/?p=2304</guid>
 </post>
 <post>
  <post_title>MooTools FileManager</post_title>
  <guid>https://davidwalsh.name/?p=2288</guid>
 </post>
 <post>
  <post_title>PHPTVDB: Using PHP to Retrieve TV Show Information</post_title>
  <guid>https://davidwalsh.name/?p=2266</guid>
 </post>
 <post>
  <post_title>David Walsh: The Lost MooTools Plugins</post_title>
  <guid>https://davidwalsh.name/?p=2258</guid>
 </post>
 <post>
  <post_title>Create Short URLs Using U.Nu</post_title>
  <guid>https://davidwalsh.name/?p=2218</guid>
 </post>
 <post>
  <post_title>Create Bit.ly Short URLs Using PHP</post_title>
  <guid>https://davidwalsh.name/?p=2194</guid>
 </post>
 <post>
  <post_title>Represent Your Repositories Using the GitHub Badge!</post_title>
  <guid>https://davidwalsh.name/?p=2178</guid>
 </post>
 <post>
  <post_title>ZebraTable</post_title>
  <guid>https://davidwalsh.name/?page_id=2172</guid>
 </post>
 <post>
  <post_title>MooTools Zebra Table Plugin</post_title>
  <guid>https://davidwalsh.name/?p=2168</guid>
 </post>
 <post>
  <post_title>SSLmatic: Quality, Cheap SSL Certificates and Giveaway!</post_title>
  <guid>https://davidwalsh.name/?p=2158</guid>
 </post>
</posts>
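For completeness, here is a small sketch of how a client might consume that XML response with SimpleXML. The URL is the same hypothetical example domain used above, and allow_url_fopen must be enabled for simplexml_load_file() to fetch a remote URL.

 /* client-side sketch: read the XML response and list each post */
 $xml = simplexml_load_file('http://mydomain.com/web-service.php?user=2&num=10');
 foreach($xml->post as $post) {
  echo $post->post_title, ' -> ', $post->guid, "\n";
 }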
Take this next sample URL for example:
http://mydomain.com/web-service.php?user=2&num=10&format=json
Now, we can take a look at the possible results of the URL.

The JSON Output

{"posts":[{"post":{"post_title":"SSLmatic SSL Certificate Giveaway Winners","guid":"http:\/\/davidwalsh.name\/?p=2304"}},{"post":{"post_title":"MooTools FileManager","guid":"http:\/\/davidwalsh.name\/?p=2288"}},{"post":{"post_title":"PHPTVDB: Using PHP to Retrieve TV Show Information","guid":"http:\/\/davidwalsh.name\/?p=2266"}},{"post":{"post_title":"David Walsh: The Lost MooTools Plugins","guid":"http:\/\/davidwalsh.name\/?p=2258"}},{"post":{"post_title":"Create Short URLs Using U.Nu","guid":"http:\/\/davidwalsh.name\/?p=2218"}},{"post":{"post_title":"Create Bit.ly Short URLs Using PHP","guid":"http:\/\/davidwalsh.name\/?p=2194"}},{"post":{"post_title":"Represent Your Repositories Using the GitHub Badge!","guid":"http:\/\/davidwalsh.name\/?p=2178"}},{"post":{"post_title":"ZebraTable","guid":"http:\/\/davidwalsh.name\/?page_id=2172"}},{"post":{"post_title":"MooTools Zebra Table Plugin","guid":"http:\/\/davidwalsh.name\/?p=2168"}},{"post":{"post_title":"SSLmatic: Quality, Cheap SSL Certificates and Giveaway!","guid":"http:\/\/davidwalsh.name\/?p=2158"}}]}
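Consuming the JSON variant is just as simple; here is another minimal client-side sketch against the same hypothetical URL:

 /* client-side sketch: decode the JSON response into associative arrays */
 $json = file_get_contents('http://mydomain.com/web-service.php?user=2&num=10&format=json');
 $data = json_decode($json, true); //true returns associative arrays instead of objects
 foreach($data['posts'] as $entry) {
  echo $entry['post']['post_title'], ' -> ', $entry['post']['guid'], "\n";
 }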
Creating a basic web service is very simple and encourages your users to spread the word about your website or service. Want more traffic? Opening up your data through a simple API like this is a good way to earn it.

7 Best WordPress Maintenance Services


When it comes to running an online business, time management is key. With so many tasks that need completing, it’s easy to feel like there aren’t enough hours in the day.
The reality of business, however, is that some tasks are more important than others. Client prospecting, strategizing, and analyzing your marketing channels will all grow your business and add to your bottom line – so will actually completing the client work that you’re paid to do, of course.


Ten years ago, power usage at data centers was growing at an unsustainable rate, soaring 24% from 2005 to 2010. But a shift to virtualization, cloud computing and improved data center management is reducing energy demand.

According to a new study, data center energy use is expected to increase just 4% from 2014 to 2020, despite growing demand for computing resources.
Total data center electricity usage in the U.S., which includes powering servers, storage, networking and the infrastructure to support it, was at 70 billion kWh (kilowatt hours) in 2014, representing 1.8% of total U.S. electricity consumption.
Based on current trends, data centers are expected to consume approximately 73 billion kWh in 2020, becoming nearly flat over the next four years. "Growth in data center energy consumption has slowed drastically since the previous decade," according to a study by the U.S. Department of Energy's Lawrence Berkeley National Laboratory. "However, demand for computations and the amount of productivity performed by data centers continues to rise at substantial rates."
Improved efficiency is most evident in the growth rate of physical servers.
From 2000 to 2005, server shipments increased 15% each year, resulting in a near doubling of servers in data centers. From 2005 to 2010, the annual shipment increases fell to 5%, but some of this decline was due to the recession. Nonetheless, this server growth rate is now at 3%, a pace that is expected to continue through 2020.
The reduced server growth rate is a result of the increase in server efficiency, better utilization thanks to virtualization, and a shift to cloud computing. This includes concentration of workloads in so-called "hyperscale" data centers, defined as 400,000 square feet in size and above.

OUR TEAM HISTORY

Reliablesoft.net launched in 2002, when search engines were in their early stages; Google was a small but very promising company, and you could still hear names like Lycos, Excite, and AOL. Browsers were primitive (think Internet Explorer and Netscape) and search engine optimization (SEO) had a completely different meaning. Since then many things have changed, and Internet marketing has become a necessity for any company that wants to survive in the competitive online world.
Reliablesoft.net was fortunate enough to deal with the technologies around web marketing from the very beginning. After countless hours of research, development, and trial and error, we can now offer you the most reliable solutions in the world of digital marketing.
Our headquarters are located in Europe, but the majority of our clients are in the United States, Canada, the United Kingdom, Australia, and China. Efficient results, customer care, and support are our main concerns; at least, that is what our customers say.

Our Management Team


Mahbub, Digital Marketing Manager
Mahbub has more than 12 years of practical experience with software and web development, SEO, social media marketing, ecommerce, and Internet marketing. He studied Computing, Multimedia and eCommerce in the UK (1997-2001). In 2002 he completed the Associate Bankers Diploma (American Institute of Bankers), and he is also a Microsoft Certified .NET developer. Over the years he has worked extensively with different technologies (Microsoft, Oracle, IBM, SAP) as a software developer, analyst, and project manager.
Since 2000 Alex has been actively involved with web development, SEO and social media. He developed a number of web properties and applied his own flavor of SEO and Internet marketing practices with remarkable results.
Which are the 10 best and most popular search engines in the world? Besides Google and Bing, there are other search engines that may not be so well known but still serve millions of search queries per day.
It may come as a surprise to many people, but Google is not the only search engine available on the Internet today! In fact, a number of search engines are trying to unseat Google from its throne, but none of them is ready (yet) to pose even a threat. Nevertheless, there are search engines worth considering, and the top 10 are presented below.

  1. Google – No need for further introductions. The search engine giant holds first place in search with a stunning lead of roughly 45 percentage points over second-place Bing. According to the latest comScore report (October 2012), 69.5% of searches were powered by Google and 25% by Bing. Google also dominates the mobile/tablet search engine market with an 89% share!

In the blink of an eye, new tools come and go. There are a million of them out there, and new and experienced SEOs alike are always asking: what are the best tools? Which ones are definite "must-haves," and which should I shelve?
In 15 years as a digital marketer, I’ve been particularly interested in searching for powerful tools that can go a long way. I’ve always believed that by revealing the right data, a good tool can open doors. So without further ado, here is a list of some of the best tools I’ve discovered, and why.

SEMrush

The first, and most powerful, way we use SEMrush is to determine whether a site is healthy. Whenever we are doing outreach and working on link-building campaigns, we always check SEMrush to see how many keywords a site is ranking for in Google. When we find sites that have a high volume of inbound links and DA but no keywords ranking in SEMrush, it's likely they've been penalized. We also check their organic traffic to determine if there's been a massive drop; if there has, we know to avoid reaching out to that site.
We also use SEMrush as part of our keyword research. We find the top sites ranking for a primary keyword, then see what keywords those sites are ranking for. When you combine the keyword list with a few sites in the industry, you end up with a pretty comprehensive keyword list.

BuzzSumo

Most don’t think of BuzzSumo as an SEO tool, but as a social media tool. Yet as SEO and content marketing are becoming indelibly linked, BuzzSumo is now a powerful tool in an SEO’s arsenal.
We use BuzzSumo when searching for content ideas – visual, social, and blog content. After creating a database of keywords, we then look up those keywords in BuzzSumo to brainstorm creative ways of crafting titles. When brainstorming ideas for new infographics, BuzzSumo is usually our first stop. When figuring out popular ideas for Pinterest boards, BuzzSumo data can be invaluable.

And if you click on the “Backlinks” tab, you can even see the social shares of the most popular inbound links. What an amazing way to create a list of sites to reach out to – looking past the usual SEO metrics, you can find popular pages that, if you can acquire a link, will generate tons of social buzz.

AuthorityLabs

With AuthorityLabs, you can track historical rankings and see graphical representations of how a keyword is ranking over time.
Most experienced SEOs know this should never be an isolated KPI, as the data varies based on personalization, data center, and so on. However, it can be used as a mechanism for seeing how a keyword behaves over time, given the consistency of the data provided by AuthorityLabs. I also love that you can view multiple sites at once for comparison, so you can see how competitors are faring against you, especially if you research their marketing strategy.

SpyFu

One of the ways I use SpyFu is for competitive research. When I start working with a new site, I check out what they’re doing on organic and PPC, and it becomes a fantastic place to start analyzing their strategy.

In this overview report, I can instantly see what keywords they’re paying for on PPC, the CPC, daily cost, etc. Similarly I can also see what keywords they’re ranking for organically, how much traffic they’re drawing in, and much more.
There are many other reports you can pull using SpyFu, as you can see in the screenshot. One of my favorites is the ad history report – when starting a new PPC campaign, that data can be invaluable when searching for inspiration on ad creation.
The SpyFu backlink tool is not the most powerful on the market, but as an add-on to some of the other indispensable features, it can be helpful, too. The contact feature can save a bit of time when doing outreach, but again, isn’t as comprehensive as needed and shouldn’t be used as a standalone tool.

Ahrefs

When it comes to backlink research and link data for link audits, I find Ahrefs data to be extremely powerful. I use Ahrefs daily to find out the following:
  • How many referring domains does a site have?
  • How many of those come from unique IPs?
  • How many of those links are from .gov or .edu sites?
  • How natural is their link growth over time?
  • What are their most authoritative links (sorted by Domain Rank)?
  • We also use this data when mapping out our link-building strategy for clients: we identify the top-ranking sites and tally how many links they have in each domain rank range, which the Ahrefs breakdown tool makes easy.
  • Are there sitewide links, or instances where there are too many links from a single domain?
  • For sites that have had penalties, or for anyone interested in monitoring new backlinks, you can watch new links gained in the last day, the previous day, or the last seven, 30, or 60 days.
Other ways you can use Ahrefs include checking anchor text density, finding broken and lost links, exporting link data for a Google Penguin audit, and much more.
Ahrefs has also been launching new features, such as top content and top referring content, and is rapidly expanding beyond a link analysis tool.

Majestic

My favorite aspect of Majestic is, of course, its trust flow and citation flow metrics. Most importantly, topical trust flow helps when analyzing a site's topical data while working on its semantic context.

When working on outreach and link-building campaigns, we identify top ranking industry sites and gather the trust and citation flow, as well as the topical trust flow numbers, in order to focus on mirroring a similar profile.
We also use Majestic for similar data as Ahrefs:
  • To download a list of backlinks when working on link audits
  • To check for referring domains and their backlinks, trust flow, citation flow, and topical trust flow
  • To identify new and lost links (to try and recover them or disavow them if they’re low quality)

DeepCrawl

When it comes to on-page and Panda audits, DeepCrawl is my right-hand man.
[Screenshot: DeepCrawl report]
As you can see in the image above, with DeepCrawl you can diagnose all sorts of technical and on-page issues, from duplicate pages and meta titles that are too long to pages that take too long to load.
There are dozens of valuable items to be identified using DeepCrawl, information that would otherwise be difficult to acquire. In the past I had to combine ScreamingFrog, URLprofiler, Copyscape, and a massive amount of Excel formulas and data hacking. With DeepCrawl, most potential on-page issues become evident and can be identified before they become a problem (if you are running crawls regularly).
It's a great tool for diagnosing particular pages, as at a glance you can see detailed information such as whether a page has an H1, how many internal and external links it has, what tags it carries, etc.

When it comes to SEO tools, there are some amazing ones already on the market. One thing I’ve learned, however, is that the best tools are the ones that your team creates specifically for your needs.
If you have talented programmers on board, don’t shy away from creating your own tools from scratch. It can save you tremendous amounts of time and energy. When we finally got tired of trying to use outreach tools that simply couldn’t support our volume, we created our own in-house outreach tool.

Instead of switching back and forth between a bunch of different tools, we’ve worked with programmers to streamline all our SEO processes, whether that’s compiling and organizing data from outside tools, optimizing AdWords scripts, or simply creating extensions that allow us to simplify the project management side of things.
What about you – what are some of your favorite SEO tools? I love discovering new tools – so please share your favorites!