Sunday, June 17, 2012


Electrical pollution


Electrical pollution is electromagnetic-field (or EM-field) energy emanating from electrical wiring. In most places, the majority of this energy exists at 60 Hz (hertz), resulting from the AC (alternating current) that constantly flows in the utility wiring outdoors, indoors, and inside common appliances.


Not all of the energy in utility electricity occurs at the standard AC frequency of 60 Hz (50 Hz in some locations). Emissions also take place at various other frequencies. These emissions, sometimes called dirty electricity, result from the use of appliances that generate irregular waveforms and transmit the resulting currents into the utility wiring. Vacuum cleaners, hair dryers, fluorescent lamps, and some consumer electronic devices produce this type of energy.
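To see how such emissions show up at frequencies other than the fundamental, here is a minimal Python sketch (assuming NumPy is available) that builds an invented "dirty" waveform and inspects its spectrum with a Fourier transform. The 180 Hz component and the high-frequency switching noise are made up purely for illustration.

```python
import numpy as np

# Illustrative only: a clean 60 Hz sine plus invented higher-frequency
# "dirty electricity" components, sampled for one second.
fs = 10_000                        # sampling rate in Hz
t = np.arange(fs) / fs             # one second of samples

clean = np.sin(2 * np.pi * 60 * t)                        # 60 Hz fundamental
dirty = (clean
         + 0.2 * np.sin(2 * np.pi * 180 * t)              # a low-order harmonic
         + 0.1 * np.sign(np.sin(2 * np.pi * 2_000 * t)))  # square-wave switching noise

# A discrete Fourier transform shows where the energy sits.
spectrum = np.abs(np.fft.rfft(dirty)) / len(dirty)
freqs = np.fft.rfftfreq(len(dirty), d=1 / fs)

# Report the strongest components above a small threshold.
for f, a in zip(freqs, spectrum):
    if a > 0.01:
        print(f"{f:7.1f} Hz  amplitude {a:.3f}")
```

Running this prints a large peak at 60 Hz and smaller peaks at the other frequencies, which is exactly the signature dirty electricity adds on top of the clean fundamental.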


In recent years, controversy has arisen over alleged adverse health effects of electrical pollution. Some scientists believe that electrical pollution has been a major cause of human ailments ever since AC utility electricity first came into use, around 1900. Others deny that any conclusive evidence exists for adverse health effects in humans.

Wednesday, June 13, 2012


NoOps

NoOps (no operations) is the concept that an IT environment can become so automated and abstracted from the underlying infrastructure that there is no need for a dedicated team to manage software in-house.
Traditionally in the enterprise, an application development team is in charge of gathering business requirements for a software program and writing code. The development team tests the program in an isolated development environment for quality assurance (QA) and -- if requirements are met -- releases the code to an operations team, which deploys and maintains the program from that point on. In a NoOps scenario, maintenance and other tasks performed by the operations team would be automated.
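As a loose illustration of what that automation might look like, the Python sketch below runs a hypothetical test-deploy-verify pipeline with no human handoff. The make targets and the health-check URL are placeholders invented for this example, not any vendor's real tooling.

```python
import subprocess
import sys
import time
import urllib.request

# Hypothetical pipeline: every step a human operations team would
# perform -- test, deploy, verify, remediate -- runs unattended.

def run(cmd):
    print(f"running: {cmd}")
    subprocess.run(cmd, shell=True, check=True)

def healthy(url, attempts=5):
    """Poll a health endpoint until it answers 200 or we give up."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except OSError:
            time.sleep(3)          # wait for the service to come up
    return False

run("make test")                   # automated QA gate
run("make deploy")                 # automated release, no ops handoff
if not healthy("http://localhost:8080/health"):
    run("make rollback")           # automated remediation as well
    sys.exit("deploy failed health check; rolled back")
print("deployed and verified")
```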
Forrester coined the term NoOps, which it defines as "the goal of completely automating the deployment, monitoring and management of applications and the infrastructure on which they run." According to Forrester Senior Analyst Glenn O'Donnell, who co-authored the report "Augment DevOps with NoOps," it is more likely that, while some operations positions will become unnecessary, others will simply evolve from a technical orientation toward a more business-oriented focus.
The two main drivers behind NoOps are increasing IT automation and cloud computing. At its most extreme, a NoOps organization is one that has no operations employees at all; however, various other systems can be referred to as "NoOps" as well. For example, Platform-as-a-Service (PaaS) vendors such as AppFog and Heroku describe their offerings as NoOps platforms.
NoOps can be contrasted with DevOps, a concept in which the line between development and operations teams is blurred and members of each group assume some of the responsibilities of the other team.

Tuesday, June 5, 2012

Noisy text 

Noisy text is electronically stored communication that cannot be categorized properly by a text mining software program. In an electronic document, noisy text is characterized by a discrepancy between the letters and symbols in the HTML code and the author's intended meaning.
Noisy text does not comply with the rules the program uses to identify and categorize words, phrases and clauses in a particular language. Idiomatic expressions, abbreviations, acronyms and business-specific lingo can all cause noisy text. It is particularly prevalent in the unstructured text found in blog posts, chat conversations, discussion threads and SMS text messages. Other potential causes include poor spelling and punctuation, typographical errors and poor output from optical character recognition (OCR) and speech recognition programs.
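One simple way a program can flag noisy text is to check each token against the vocabulary and rules it knows. The Python sketch below uses a tiny invented vocabulary purely to show the idea; real text mining systems rely on far larger dictionaries and language models.

```python
import re

# Illustrative vocabulary; a real system would use a full dictionary
# or a language model rather than this tiny set.
VOCABULARY = {"meeting", "tomorrow", "please", "confirm", "the", "at"}

def noisy_tokens(text):
    """Return tokens the toy rules cannot categorize."""
    flagged = []
    for token in text.lower().split():
        word = re.sub(r"[^a-z']", "", token)    # strip punctuation/digits
        if not word:                            # pure symbols or numbers
            flagged.append(token)
        elif word not in VOCABULARY:            # abbreviations, typos, lingo
            flagged.append(token)
    return flagged

# SMS-style input full of abbreviations and typos.
print(noisy_tokens("pls confirm da mtg tmrw @ 10"))
# -> ['pls', 'da', 'mtg', 'tmrw', '@', '10']
```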

Wednesday, May 30, 2012

Predictive Modeling


Predictive modeling is a process used in predictive analytics to create a statistical model of future behavior. Predictive analytics is the area of data mining concerned with forecasting probabilities and trends.
A predictive model is made up of a number of predictors, which are variable factors that are likely to influence future behavior or results. In marketing, for example, a customer's gender, age, and purchase history might predict the likelihood of a future sale.
In predictive modeling, data is collected for the relevant predictors, a statistical model is formulated, predictions are made and the model is validated (or revised) as additional data becomes available. The model may employ a simple linear equation or a complex neural network, mapped out by sophisticated software.
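To make that workflow concrete, here is a minimal Python sketch (assuming NumPy is available) that fits the simplest model mentioned above, a linear equation, to invented marketing data and then scores a new customer. The numbers are invented for illustration only.

```python
import numpy as np

# Invented training data: predictors are [age, purchases last year],
# and the target is whether a sale followed (1) or not (0).
X = np.array([[25, 1], [34, 4], [41, 2], [52, 7], [29, 0], [46, 5]], dtype=float)
y = np.array([0, 1, 0, 1, 0, 1], dtype=float)

# Formulate the simplest model: a linear equation
# score ~ w0 + w1*age + w2*purchases, fitted by least squares.
A = np.hstack([np.ones((len(X), 1)), X])        # prepend an intercept column
w, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict for a new customer; in practice the model is then validated
# (or revised) as additional labeled data becomes available.
new_customer = np.array([1.0, 38, 3])           # intercept, age, purchases
print("predicted sale score:", new_customer @ w)
```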
Predictive modeling is used widely in information technology (IT). In spam filtering systems, for example, predictive modeling is sometimes used to identify the probability that a given message is spam. Other applications of predictive modeling include customer relationship management (CRM), capacity planning, change management, disaster recovery, security management, engineering, meteorology and city planning.
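One common textbook way to estimate the probability that a message is spam is a Naive Bayes word model. The sketch below uses invented word counts and a 50/50 prior purely to show the arithmetic.

```python
import math

# Invented word frequencies from a labeled training set:
# counts of each word in spam and in legitimate (ham) messages.
spam_counts = {"free": 40, "winner": 25, "meeting": 2}
ham_counts  = {"free": 5,  "winner": 1,  "meeting": 30}
spam_total, ham_total = 100, 100          # total words per class
prior_spam = 0.5                          # assume half of mail is spam

def spam_probability(message):
    """Naive Bayes estimate of P(spam | message) with add-one smoothing."""
    log_spam = math.log(prior_spam)
    log_ham = math.log(1 - prior_spam)
    vocab = len(set(spam_counts) | set(ham_counts))
    for word in message.lower().split():
        log_spam += math.log((spam_counts.get(word, 0) + 1) / (spam_total + vocab))
        log_ham  += math.log((ham_counts.get(word, 0) + 1) / (ham_total + vocab))
    # Convert the two log scores back to a probability.
    return 1 / (1 + math.exp(log_ham - log_spam))

print(spam_probability("free winner"))    # high -> likely spam
print(spam_probability("meeting"))        # low  -> likely legitimate
```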

Quality function deployment (QFD)

Quality function deployment (QFD) is the translation of user requirements and requests into product designs. The goal of QFD is to build a product that does exactly what the customer wants instead of delivering a product that emphasizes expertise the builder already has.
QFD was created by Japanese planning specialist Yoji Akao in 1966 as a way to help product planners look at new (or in-development) products through the lenses of customer, company and technology. QFD is achieved by linking the needs of the end user to subsystems or specific elements of the product creation process -- from design and development to engineering, manufacturing and services.
Visual representations of market needs are key components of QFD, and graphs and matrices are typically deployed to track the process. For instance, Six Sigma QFD requires customers to document their needs and wants in their own words so that a "House of Quality" matrix can be built. The customer then meets with the manufacturer to rank those requirements so the manufacturer understands the priorities and can translate them into engineering and business process requirements. Finally, the manufacturer establishes design criteria to ensure the customer's requirements are met.
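The core arithmetic behind a House of Quality matrix is simple: each engineering characteristic gets a priority equal to the sum of customer weights multiplied by relationship strengths. The requirements, weights and 9/3/1 relationship scale in the Python sketch below are illustrative conventions, not fixed rules.

```python
# A toy House of Quality: customer requirements (rows) scored against
# engineering characteristics (columns). All values are invented.
requirements = ["easy to carry", "long battery life", "durable case"]
weights      = [5, 4, 3]                      # customer-stated priority, 1-5

characteristics = ["weight (g)", "battery (mAh)", "shell thickness (mm)"]
relationships = [                             # 9 strong, 3 moderate, 1 weak, 0 none
    [9, 0, 3],    # easy to carry
    [3, 9, 0],    # long battery life
    [1, 0, 9],    # durable case
]

# Priority of each characteristic = sum of (weight x relationship strength).
for col, name in enumerate(characteristics):
    score = sum(w * row[col] for w, row in zip(weights, relationships))
    print(f"{name:22s} priority {score}")
```

The resulting scores tell the manufacturer which engineering characteristics matter most to the customer and therefore deserve the tightest design criteria.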

Tuesday, May 22, 2012


Digg


Digg is a social networking website featuring user-submitted news stories. Its links come from across the Internet, ranging from widely known news sources to obscure blogs. Digg also builds its own list of popular stories that are going viral across the Web.
To use Digg, users submit stories, and the Digg community votes on which ones they like the best. Every story has a "Digg button," and as the story collects positive votes, the story is cross-pollinated across other channels.
Digg's most popular stories appear in the "Top News" section of the website, which can feature anything from serious news to fun content. Users can also customize their own news feeds through the "My News" interface, which lets them select information based on the people they follow, stories they've already read and stories that are trending across the Digg community.
Digg is also organized into categories based on topics such as technology and business, and users can sort content by News, Images and Videos. Users can also vote down stories that they don't like or don't consider relevant using the "Bury" button.
Digg was founded in the fall of 2004 and launched that December. It quickly gained popularity and became one of the 100 most-trafficked sites on the Internet. Since its redesign in 2010, Digg has seen a decline in users, many of whom complained that they preferred the old design.



Friday, May 18, 2012

Data Center Infrastructure Management



Data Center Infrastructure Management (DCIM) is the convergence of IT and data center facilities functions within an organization. The goal is to improve the efficiency of energy use, optimize equipment layouts, support virtualization and consolidation, develop more strategic planning and enhance the overall availability of the data center. DCIM typically relies on software tools for real-time systems analysis, control and planning.


When properly implemented, DCIM can provide a holistic view of the facility, from the rack or cabinet level to the cooling infrastructure to the building's energy utilization. DCIM tools can also help administrators locate and identify relationships between the facility and IT systems that may compromise data center resilience. DCIM tools can be used to measure energy use and facilitate conservation tactics that reduce data center operating expenses. Some organizations couple DCIM with computational fluid dynamics (CFD) analysis to optimize airflow and systems placement and further reduce cooling expenses.


Energy-monitoring sensors and supporting hardware must be installed along all points of the power infrastructure so the DCIM software can accurately aggregate and analyze power usage effectiveness (PUE) and cooling system energy efficiency. However, some vendors offer software-only DCIM tools that can integrate with existing monitoring hardware. Converged infrastructure platforms such as Cisco Systems' Unified Computing System (UCS) frequently include DCIM tools as part of the converged infrastructure package.
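PUE itself is a straightforward ratio: total facility energy divided by the energy consumed by IT equipment, with values near 1.0 indicating that almost all power reaches the IT gear. The Python sketch below computes it from invented meter readings.

```python
# PUE (power usage effectiveness) = total facility energy / IT equipment
# energy. The meter readings below are invented for illustration.
readings_kw = {
    "it_equipment": 480.0,    # servers, storage, network
    "cooling":      220.0,    # CRAC units, chillers
    "power_losses":  45.0,    # UPS and distribution losses
    "lighting":      15.0,
}

total_facility = sum(readings_kw.values())
pue = total_facility / readings_kw["it_equipment"]

print(f"total facility load: {total_facility:.0f} kW")
print(f"PUE: {pue:.2f}")      # 760 / 480 -> about 1.58
```

In a DCIM deployment, readings like these would stream in continuously from the monitoring sensors, letting the software track PUE over time rather than compute it from a single snapshot.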