Tuesday, October 29, 2013

Human Capital Management (HCM)

Human capital management (HCM) is an approach to employee staffing that perceives people as assets (human capital) whose current value can be measured and whose future value can be enhanced through investment.
The term human capital management can be controversial because the word "capital" has an impersonal connotation, implying that employees are simply an expensive operating cost to be minimized whenever possible. A responsible human capital management strategy, however, is built on the understanding that an organization's employees are its most valuable asset. Spending time and energy on records that allow managers to develop staff effectively and promote employee engagement helps the organization achieve both its short- and long-term monetary goals.
Successful human capital management requires extensive documentation, and HCM software can streamline and automate many of the day-to-day record-keeping processes. When an organization evaluates an HCM system investment, it must weigh the benefits of a standalone HCM approach against those of an all-in-one enterprise resource planning (ERP) suite that includes HCM modules. In a large enterprise, one integrated platform with a single database for everything can save on the cost of maintaining and upgrading individual software applications and application programming interfaces (APIs). In a small or midsize company, however, it may be simpler to enter the same data manually into multiple systems.

Tuesday, October 15, 2013

Cloud Backup

Cloud backup, also known as online backup, is a strategy for backing up data that involves sending a copy of the data over a proprietary or public network to an off-site server. The server is usually hosted by a third-party service provider, who charges the backup customer a fee based on capacity, bandwidth or number of users.
Online backup systems are typically built around a client software application that runs on a schedule determined by the level of service the customer has purchased. If the customer has contracted for daily backups, for instance, then the application collects, compresses, encrypts and transfers data to the service provider's servers every 24 hours.
To reduce the amount of bandwidth consumed and the time it takes to transfer files, the service provider might perform only incremental backups after the initial full backup.
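The incremental approach can be sketched in Python: a manifest of file digests from the previous run lets the client select only the files that are new or changed. This is an illustrative sketch, not the protocol of any particular backup product; a real client would also compress and encrypt the selected files before transferring them.

```python
import hashlib
import json
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 digest used to detect whether a file changed since the last run."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def incremental_backup(source: Path, manifest_path: Path) -> list:
    """Return only the files that are new or changed since the previous backup.

    The manifest maps each file path to the digest recorded on the last run;
    on the first run (no manifest yet) every file is selected -- a full backup.
    """
    manifest = {}
    if manifest_path.exists():
        manifest = json.loads(manifest_path.read_text())

    changed = []
    for path in sorted(source.rglob("*")):
        if not path.is_file():
            continue
        digest = file_digest(path)
        if manifest.get(str(path)) != digest:
            changed.append(path)          # new or modified since last run
        manifest[str(path)] = digest

    manifest_path.write_text(json.dumps(manifest))
    return changed
```

On the first call every file is returned (a full backup); subsequent calls return only the files whose contents changed.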

Monday, October 14, 2013

Sock Puppet Marketing

Sock puppet marketing is the use of a false online identity to artificially stimulate demand for a product, brand or service. The false identity is called a sock puppet.
A primary goal of sock puppet marketing is to increase sales by posting positive comments about a product, service or brand on web sites. Alternatively, a sock puppet might be used to post negative comments that denigrate a competitor.
Sock puppet marketing and sock puppetry in general are unethical. When exposed, sock puppet marketing can damage the reputation and brand of a product or service.

Thursday, October 10, 2013

Dynamic Pricing

Dynamic pricing, also called real-time pricing, is a highly flexible approach to setting the price of a product or service.
The goal of dynamic pricing is to allow a company that sells goods or services over the Internet to adjust prices on the fly in response to market demands. Changes are controlled by pricing bots, which are software agents that gather data and use algorithms to adjust pricing according to business rules. Typically, the business rules take into account such things as the time of day, day of the week, level of demand and competitors' pricing.
With the advent of big data and big data analytics, business rules can be crafted to adjust prices for specific customers based on criteria such as the customer's zip code, how often the customer has made purchases in the past and how much the customer typically spends.
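The rule evaluation a pricing bot performs can be sketched as follows. The rules and multipliers here are invented for illustration; real systems derive them from business policy and market data.

```python
from datetime import datetime

def dynamic_price(base_price: float,
                  demand_ratio: float,
                  competitor_price: float,
                  now: datetime) -> float:
    """Adjust a base price using simple, illustrative business rules.

    demand_ratio: current demand divided by typical demand (1.0 = normal).
    """
    price = base_price

    # Rule 1: scale with demand, capped at +/-25%.
    price *= max(0.75, min(1.25, demand_ratio))

    # Rule 2: evenings and weekends carry a small premium.
    if now.hour >= 18 or now.weekday() >= 5:
        price *= 1.05

    # Rule 3: stay within 10% of the competitor's price in either direction.
    price = max(price, competitor_price * 0.9)
    price = min(price, competitor_price * 1.1)

    return round(price, 2)
```

A bot would re-run this calculation whenever its inputs (demand, competitor prices, the clock) change, which is what makes the pricing "real-time."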

Wednesday, October 9, 2013

Wireshark

Wireshark is an open source tool for capturing network traffic and analyzing packets, often used in network forensics. Such a tool is commonly referred to as a network analyzer, network protocol analyzer or sniffer.
Wireshark is a popular tool for testing basic traffic transmission, analyzing bandwidth usage, testing application security and identifying faulty configurations. The tool is quite versatile, allowing network administrators to examine traffic details at a variety of levels.
Because Wireshark is open source, it can be extended -- for example, with dissectors for protocols unique to a specific enterprise network -- and its capture and display filters can be tailored to an organization's needs.
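For example, Wireshark's display-filter syntax lets an administrator narrow a capture to just the traffic of interest:

```
# Show only HTTP requests
http.request

# Traffic to or from a single host
ip.addr == 10.0.0.5

# TCP traffic on port 443 with the SYN flag set
tcp.port == 443 && tcp.flags.syn == 1
```

Note that display filters (applied after capture) use this syntax, while capture filters use the separate BPF syntax (e.g. `tcp port 443`).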

Tuesday, October 8, 2013

Big Data Management

Big data management is the organization, administration and governance of large volumes of both structured and unstructured data.
The goal of big data management is to ensure a high level of data quality and accessibility for business purposes. By examining data from a variety of sources -- including call detail records, system logs and social media sites -- a company can gain insight into what business processes need improvement and how to gain a competitive advantage.
As part of the process, the company must decide what data must be kept for compliance reasons, what data can be disposed of and what data should be kept for in-memory analysis. The process requires careful data classification so that ultimately, smaller sets of data can be worked with.
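One way to picture that classification step is as a simple retention policy. The record types and retention periods below are hypothetical; real policies come from the organization's compliance and analytics requirements.

```python
from datetime import date

# Hypothetical retention rules, in years, per record type.
RETENTION_YEARS = {"call_detail": 7, "system_log": 1, "social_media": 0}

def classify(record_type: str, created: date, today: date) -> str:
    """Sort a record into in-memory analysis, compliance archive, or disposal."""
    age_days = (today - created).days
    keep_years = RETENTION_YEARS.get(record_type, 0)
    if age_days <= 30:
        return "in-memory"             # recent data stays hot for analysis
    if age_days <= keep_years * 365:
        return "compliance-archive"    # retained to meet regulatory rules
    return "dispose"
```

Applying such a policy up front is what yields the smaller working sets the text describes: only the "in-memory" slice needs to live in fast, expensive storage.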

Thursday, October 3, 2013

Application Delivery Controller

An application delivery controller (ADC) is a network device that manages client connections to complex Web and enterprise applications. In general, a controller is a hardware device or a software program that manages or directs the flow of data between two entities.
An ADC essentially functions as a load balancer, optimizing end-user performance, reliability, data center resource use and security for enterprise applications. Typically, ADCs are strategically placed to act as a single point of control that can determine the security needs of an application and provide simplified authentication, authorization and accounting (AAA).
An ADC can accelerate the performance of applications delivered over the wide area network (WAN) by implementing optimization techniques such as compression and reverse caching. With reverse caching, new user requests for static or dynamic Web objects can often be delivered from a cache in the ADC rather than having to be regenerated by the servers.
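The core idea of reverse caching can be sketched in a few lines of Python: repeated requests are served from memory rather than regenerated at the origin. A production ADC adds per-content-type TTLs, invalidation and compression, which this sketch omits.

```python
import time

class ReverseCache:
    """Minimal sketch of an ADC-style reverse cache."""

    def __init__(self, origin, ttl_seconds: float = 60.0):
        self.origin = origin     # callable that renders a response at the server
        self.ttl = ttl_seconds
        self._store = {}         # url -> (expiry_time, response)

    def get(self, url: str) -> str:
        entry = self._store.get(url)
        now = time.monotonic()
        if entry and entry[0] > now:
            return entry[1]                      # cache hit: origin untouched
        response = self.origin(url)              # cache miss: regenerate
        self._store[url] = (now + self.ttl, response)
        return response
```

The second request for the same URL never reaches the origin server, which is exactly the work an ADC's reverse cache saves the data center.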

STONITH (Shoot The Other Node In The Head)

STONITH (Shoot The Other Node In The Head) is a Linux service for maintaining the integrity of nodes in a high-availability (HA) cluster.
STONITH automatically powers down a node that is not working correctly. An administrator might employ STONITH if one of the nodes in a cluster cannot be reached by the other node(s) in the cluster.
STONITH is traditionally implemented with hardware that allows the cluster to talk to a physical server without involving the operating system (OS). Although hardware-based STONITH works well, the approach requires specific hardware to be installed in each server, which can make the nodes more expensive and result in hardware vendor lock-in. A disk-based solution, such as SBD (STONITH Block Device), can be easier to implement because it requires no special hardware.
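The fencing logic can be sketched as follows. Here `fence` stands in for a real fencing agent (for example, an out-of-band power-off command), and the function names are illustrative, not a real cluster API.

```python
def choose_nodes_to_fence(last_heartbeat: dict, now: float, timeout: float) -> list:
    """Return nodes whose heartbeat is stale and that must be powered off
    before their resources can safely be taken over elsewhere."""
    return sorted(node for node, seen in last_heartbeat.items()
                  if now - seen > timeout)

def fence_and_takeover(last_heartbeat, now, timeout, fence, takeover):
    for node in choose_nodes_to_fence(last_heartbeat, now, timeout):
        fence(node)      # power the node off first...
        takeover(node)   # ...only then is it safe to claim its resources
```

The ordering is the whole point of STONITH: resources are claimed only after the unreachable node is confirmed powered off, so two nodes can never write to the same data at once.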

Tuesday, October 1, 2013

Kyoto cooling

Kyoto cooling, also called the Kyoto wheel, is an energy-efficient free cooling method for data centers developed in the Netherlands.
Kyoto cooling uses outside air to remove the heat created by computing equipment instead of using mechanical refrigeration. Compared to the energy required by traditional computer room air conditioners, computer room air handlers and other traditional cooling methods, Kyoto cooling uses between 75% and 92% less power. Kyoto cooling is named after the Kyoto Protocol, an international agreement to reduce greenhouse gas emissions.
The Kyoto cooling method uses a thermal wheel that contains a honeycomb lattice made out of heat-absorbent material. The wheel, which is half inside and half outside the building, removes heat from circulating air by picking up heat from the data center and then releasing it into the cooler outside air as the wheel rotates. The patented Kyoto method uses the energy transferred by the honeycomb system to run small fans that help pull air through each half of the system. It also takes advantage of the hot and cold aisle concept to completely isolate the flow of hot and cold air going to and from the wheel.