Blog

Thursday, 29 September 2016 05:11

5 New Features in PHP 7


Speed:- The developers worked very hard to refactor the PHP codebase in order to reduce memory consumption and increase performance, and they certainly succeeded. Benchmarks for PHP 7 consistently show speeds twice as fast as PHP 5.6, and often even faster. Although these results are not guaranteed for your project, the benchmarks were run against major projects such as Drupal and WordPress, so these numbers don't come from abstract performance tests.

Type Declarations:-

Type declarations simply mean specifying which type a variable should be instead of allowing PHP to set this automatically. PHP is considered a weakly typed language. In essence, this means that PHP does not require you to declare data types. Variables still have data types associated with them, but you can do radical things like adding a string to an integer without causing an error. Type declarations can help you define what should occur so that you get the expected results, and they can also make your code easier to read. We'll look at some specific examples shortly. Since PHP 5, you could use type hinting to specify the expected data type of an argument in a function declaration, but only in the declaration. When you call the function, PHP checks whether or not the arguments are of the specified type. If not, the run-time will raise an error and execution will be halted. Besides only being usable in function declarations, type hints were also limited to essentially two types: a class name or an array.
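As a rough illustration of PHP 7's scalar parameter and return type declarations, here is a minimal sketch (the function name addPrices is just an invented example):

```php
<?php
declare(strict_types=1); // opt in to strict type checking for this file

// Scalar parameter types and a return type declaration (new in PHP 7).
function addPrices(float $base, float $tax): float
{
    return $base + $tax;
}

echo addPrices(10.0, 2.5); // 12.5

// With strict_types enabled, addPrices("10", 2.5) would throw a
// TypeError instead of silently coercing the string to a float.
```

Without the declare(strict_types=1) directive, PHP 7 falls back to coercive typing and will convert compatible arguments such as "10" to the declared type.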

Error Handling:-

The next feature we're going to cover is the change to error handling. Handling fatal errors in the past has been next to impossible in PHP. A fatal error would not invoke the error handler and would simply stop your script. On a production server, this usually means showing a blank white screen, which confuses the user and causes your credibility to drop. It can also cause issues with resources that were never closed properly and are still in use or even locked.

In PHP 7, an exception will be thrown when a fatal or recoverable error occurs, rather than just stopping the script. Fatal errors still exist for certain conditions, such as running out of memory, and still behave as before by immediately stopping the script. An uncaught exception will also continue to be a fatal error in PHP 7, so if an exception thrown from an error that was fatal in PHP 5 goes uncaught, it will still be a fatal error in PHP 7.

I want to point out that other types of errors such as warnings and notices remain unchanged in PHP 7. Only fatal and recoverable errors throw exceptions.

In PHP 7, Error and Exception both implement the new Throwable interface, which means they basically work the same way. You can now use Throwable in try/catch blocks to catch both Exception and Error objects. Remember that it is better practice to catch more specific exception classes and handle each accordingly. However, some situations warrant catching any exception, such as logging or framework error handling. In PHP 7, these catch-all blocks should catch Throwable instead of Exception.
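A small sketch of how a formerly fatal error surfaces as a catchable object in PHP 7, using intdiv(), which throws a DivisionByZeroError (a subclass of Error):

```php
<?php
// In PHP 7, many formerly fatal conditions throw Error objects.
// Error and Exception both implement Throwable, so one catch block
// can handle either kind.
try {
    intdiv(1, 0); // throws DivisionByZeroError in PHP 7
} catch (Throwable $t) {
    // Log or report instead of dying with a blank white screen.
    echo get_class($t) . ': ' . $t->getMessage();
}
```

In PHP 5, a comparable fatal error would have bypassed any error handler entirely; here the script keeps running after the catch block.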

New Operators

PHP 7 also brings us some new operators. The first one we're going to explore is the spaceship operator; with a name like that, who doesn't want to use it? The spaceship operator, or combined comparison operator (<=>), is a nice addition to the language, complementing the greater-than and less-than operators.
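A quick sketch of the spaceship operator, which evaluates to -1, 0, or 1 depending on how the two operands compare:

```php
<?php
var_dump(1 <=> 2); // int(-1): left is smaller
var_dump(2 <=> 2); // int(0):  both are equal
var_dump(3 <=> 2); // int(1):  left is larger

// Its most common use is in comparison callbacks, e.g. for usort():
$scores = [3, 1, 2];
usort($scores, function ($a, $b) {
    return $a <=> $b;
});
print_r($scores); // 1, 2, 3
```

Before PHP 7, the sort callback would typically have been written with an if/else chain or a subtraction trick; the spaceship operator replaces that boilerplate with a single expression.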

Easy User-land CSPRNG

What is Easy User-land CSPRNG?

User-land refers to application space that is external to the kernel and is protected by privilege separation. PHP 7 adds an easy-to-use, reliable API for a Cryptographically Secure Pseudo-Random Number Generator (CSPRNG): essentially, a secure way of generating random data. There are random number generators in PHP, rand() for instance, but none of the options in version 5 are very secure. In PHP 7, a simple interface to the operating system's random number generator was added. Because we can now use the operating system's random number generator, if that gets hacked we have bigger problems; it probably means your entire system is compromised and there is a flaw in the operating system itself. Secure random numbers are especially useful when generating random passwords or password salts.
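A brief sketch of the two user-land CSPRNG functions PHP 7 adds, random_bytes() and random_int():

```php
<?php
// random_bytes() and random_int() (new in PHP 7) draw from the
// operating system's CSPRNG, unlike the older rand() and mt_rand().

// A 32-character hex token, e.g. for a password-reset link:
$token = bin2hex(random_bytes(16));
echo $token, PHP_EOL;

// A cryptographically secure integer in an inclusive range:
echo random_int(1, 100), PHP_EOL;
```

Both functions throw an Exception if no secure source of randomness is available, rather than silently falling back to a weaker generator.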

Conclusion

There are quite a few other features added in PHP 7, such as Unicode codepoint escape syntax for emoji and international characters.

example:- echo "\u{1F60D}"; // output:- 😍

Source:-http://blog.teamtreehouse.com/5-new-features-php-7

1 INTRODUCTION

Feature-oriented software development (FOSD) is a paradigm for the construction, customization, and synthesis of large-scale software systems. The concept of a feature is at the heart of FOSD. A feature is a unit of functionality of a software system that satisfies a requirement, represents a design decision, and provides a potential configuration option. The basic idea of FOSD is to decompose a software system in terms of the features it provides. The goal of the decomposition is to construct well-structured software that can be tailored to the needs of the user and the application scenario. Typically, from a set of features, many different software systems can be generated that share common features and differ in other features. The set of software systems generated from a set of features is also called a software product line [45, 101].

FOSD aims essentially at three properties: structure, reuse, and variation. Developers use the concept of a feature to structure the design and code of a software system, features are the primary units of reuse in FOSD, and the variants of a software system vary in the features they provide. FOSD shares goals with other software development paradigms, such as stepwise and incremental software development [128, 98, 99, 105], aspect-oriented software development [58], component-based software engineering [65, 118], and alternative flavors of software product line engineering [45, 101], whose differences from FOSD we will discuss.

Historically, FOSD has emerged from different lines of research in programming languages, software architecture, and modeling, so it is not surprising that current developments in FOSD comprise concepts, methods, languages, tools, formalisms, and theories from many different fields. An aim of this article is to provide a historical overview of FOSD as well as a survey of current developments that have emerged from the different lines of FOSD research. Due to the sheer volume and diversity of work on FOSD and related fields, we cannot hope for completeness. Instead, we provide a personal view on recent interesting developments in FOSD. Also, we do not aim at a comprehensive discussion of individual approaches but at highlighting connections between different lines of FOSD research and identifying open issues.

2 CONCEPTS AND TERMINOLOGY

2.1 What is a Feature?

FOSD is not a single development method or technique, but a conglomeration of different ideas, methods, tools, languages, formalisms, and theories. What connects all these developments is the concept of a feature.

2.2 What is Feature-Oriented Software Development?

The concept of a feature is useful for the description of commonalities and variabilities in the analysis, design, and implementation of software systems. FOSD is a paradigm that favors the systematic application of the feature concept in all phases of the software life cycle. Features are used as first-class entities to analyze, design, implement, customize, debug, or evolve a software system. That is, features not only emerge from the structure and behavior of a software system, e.g., in the form of the software's observable behavior, but are also used explicitly and systematically to define variabilities and commonalities, to facilitate reuse, and to structure software along these variabilities and commonalities. A distinguishing property of FOSD is that it aims at a clean (ideally one-to-one) mapping between the representations of features across all phases of the software life cycle. That is, features specified during the analysis phase can be traced through design and implementation.

The idea of FOSD was not proposed as such in the first place but emerged from the different uses of features. A main goal of this survey is to convey the idea of FOSD as a general development paradigm. The essence of FOSD can be summarized as follows: on the basis of the feature concept, FOSD facilitates the structure, reuse, and variation of software in a systematic and uniform way.

Source:- http://www.cs.cmu.edu/

Like it or not, every single product you use on a daily basis is slowly getting smart. First it was your cellphone, then it was your TV, then your lights, thermostat, appliances, and now even the mood-enhancing candles around your home—wait, there’s a smart candle now too?!

The LuDela is another product in a long list of smart devices that consumers really didn’t ask for. But given there are entire stores dedicated to selling candles, there are undoubtedly enough interested parties out there to help make this thing a success.

Using a technology its creators call Wi-Fire, which we're just going to assume is regular old Wi-Fi with a cute name, every LuDela candle in your home can be connected and controlled from a smartphone, not unlike Philips' Hue smart lighting system. But instead of simulating a candle using something like a flickering LED, the LuDela uses actual fire, for genuine ambiance. Inside the larger candle shell you'll find smaller wax refills that burn away with a real flame that can be remotely ignited and extinguished.

With a simple screen tap inside the free accompanying app, users can activate all or some of the LuDela candles in their home without getting off the couch, or climbing out of the bathtub. Each one runs on a battery so there are no wires to deal with, and since power is only needed to momentarily ignite a wick, the candles can go for a long time before the app lets you know which ones need a charge.

Besides ease of use, and not constantly having to hunt for a lighter, the LuDela candles might also appeal to those who are uneasy about having an open flame in their homes. Proximity detectors can be activated to automatically extinguish the flame when people, or pets, get too close to the candles, and the same thing will happen if the candles ever accidentally get knocked over while lit. The LuDela base also ensures that hot candle wax never drips onto your furniture, or poses a burn risk.

Pricing might be an issue for some candle fans, though. If you’re used to buying a giant bag of tea candles from Ikea for a few bucks then you might balk at the LuDela’s $99 price tag, which doesn’t include the refills when it ships early next year. However, a discounted refill monthly subscription will be made available that includes scented options to match the seasons, if you’re all about pumpkin spice when September rolls around.

Source:-http://gizmodo.com/the-worlds-first-smart-candle-can-be-lit-and-extinguish-1786637893

People of this era keep their gadgets close to their hearts. They believe in technology like never before and have embraced it as a major part of life.

Every aspect of our life is being influenced by technology. It has helped the world come closer, and thus it is no longer a problem to communicate wherever you might be. Your loved ones are just a call away.

We have come closer as people even though we have a lot of differences. Technology has provided us with a platform to express our views and thoughts and communicate with a larger, even unknown, audience.

Another major advantage that technology has been able to provide us is the ability to store large amounts of data that are important to us. We have built-in hard drives to store all the important data required for our use. Offices use high-capacity hard drives to store the huge amounts of data required for their maintenance and everyday use. Hard drives are used in computers and laptops to store personal information, images, movies, and important documents. We keep all this data stored so that it is easy to find and retrieve when required. The hard drive can be organized in the form of folders where we can divide data according to various categories. Offices manage huge chunks of data: some of it is used for maintenance purposes, while a lot of it is used for analysis.

Hard drives are either internal or external. Internal hard drives are embedded in the machines while external ones can be used as and when required and can be attached to multiple devices.

As a lot of important data is stored in hard drives, it is necessary that we keep them safe. But there are issues with electronic devices that we need to accept. There are unfortunate events such as hard drive crashes, and that is when we are in trouble. The different events that lead to data loss are:

  • Drive Formatted.
  • Broken USB /Flash stick.
  • Deleted Data/Files.
  • Unreadable DVD/CD.
  • Loss of phone data.

There are many solutions for retrieving the data that was on the hard drive. There are many professional hard drive recovery services in Perth that provide hard drive data recovery solutions very efficiently and effectively so that you can get back all of your important data. You can search the internet for the various services that provide such solutions and sign up with one of them to avail of their services. Go through what they are willing to offer, and do check the customer reviews before deciding which service to sign up with.

These services are generally equipped with very advanced techniques and tools to keep your device and data safe during hard drive data recovery. In this age of technology and development, cybercrime is a great issue too, so it is necessary that the data is kept securely so that no one can snoop into it. Look into all these major features before choosing the service you would want to take up to keep your data stored in a secure manner.

Source:-http://www.articlesfactory.com/articles/computers/effective-hard-drive-data-recovery-is-absolute-necessities-in-todays-world.html

Saturday, 17 September 2016 04:48

Introduction to the Windows Command Prompt


Introduction

Before Windows was created, the most common operating system that ran on IBM PC compatibles was DOS. DOS stands for Disk Operating System, and it was what you would use when you started your computer, much as you use Windows today. The difference was that DOS was not a graphical operating system but rather purely textual. That meant that in order to run programs or manipulate the operating system you had to manually type in commands. When Windows was first created, it was actually a graphical user interface designed to make using the DOS operating system easier for the novice user. As time went on and newer versions of Windows were developed, DOS was finally phased out with Windows ME. Though the newer operating systems do not run on DOS, they do have something called the command prompt, which has a similar appearance to DOS. In this tutorial we will cover the basic commands and usage of the command prompt so that you feel comfortable using this resource.

Using the Command Prompt or Dos Window

When people refer to the command prompt, they may refer to it in different ways: a shell, a console window, a command prompt, a cmd prompt, or even DOS. In order to enter the command prompt you need to run a program that depends on your operating system. Below we list the programs that you need to run to enter a command prompt based on the version of Windows you are running.

Operating System               Command      Notes
Windows 3.1, 3.11, 95, 98, ME  command.com  When run, this program opens a command prompt window providing a DOS shell.
Windows NT, 2000, XP, 2003     cmd.exe      This program provides the native command prompt: what we call the command prompt.
Windows NT, 2000, XP, 2003     command.com  This program opens an emulated DOS shell for backwards compatibility. Only use it if you must.

 

To run these programs and start a command prompt you would do the following steps:

Step 1: Click on the Start Menu

Step 2: Click on the Run option

Step 3: Type the appropriate command in the Open: field. For example, if we are using Windows XP we would type cmd.exe.

Step 4: Click on the OK button

After following these steps you will be presented with a window that looks similar to Figure 1 below.

Figure 1. Windows Command Prompt

The command prompt is simply a window that by default displays the current directory, or in Windows terms a folder, that you are in, and has a blinking cursor ready for you to type your commands. For example, in Figure 1 above you can see that it says C:\WINDOWS>. The C:\WINDOWS> is the prompt, and it tells me that I am currently in the c:\windows directory. If I were in the directory c:\program files\directory, the prompt would instead look like this: C:\PROGRAM FILES\DIRECTORY>.

To use the command prompt you would type in the commands and instructions you want and then press enter. In the next section we will discuss some useful commands and how to see all available built in commands for the command prompt.

Useful commands

The command.com or cmd.exe programs have built in commands that are very useful. Below I have outlined some of the more important commands and further instruction on how to find information on all the available commands.

The Help command - This command will list all the commands built into the command prompt. If you would like further information about a particular command, you can type help commandname. For example, help cd will give you more detailed information on the cd command. For all commands you can also type the command name followed by /? to see help on the command. For example, cd /?

The Exit command - This command will close the command prompt. Simply type exit and press enter and the command prompt will close.

The CD command - This command allows you to change your current directory or see which directory you are currently in. To use the CD command you would type cd directoryname and press enter. This would change the directory you are currently in to the one specified. When using the cd command you must remember how paths work in Windows. A path to a file always starts at the root directory, which is symbolized by the \ symbol, followed by the directories underneath it. For example, the file notepad.exe, which is located in c:\windows\system32, would have the path \windows\system32\notepad.exe. If you want to change to a directory that is inside your current directory, you do not need the full path; you can just type cd directoryname and press enter. For example, if you are in a directory called c:\test, and there were three directories in the test directory called A, B, and C, you could just type cd a and press enter. You would then be in c:\test\a. If, on the other hand, you wanted to change your directory to the c:\windows\system32 directory, you would have to type cd \windows\system32 and press enter.

The DIR command - This command will list the files and directories contained in your current directory, if used without an argument, or the directory you specify as an argument. To use the command you would just type dir and press enter and you will see a listing of the current files in the directory you are in, including information about their file sizes, date and time they were last written to. The command will also show how much space the files in the directory are using and the total amount of free disk space available on the current hard drive. If I typed dir \test I would see the contents of the c:\test directory as shown in Figure 2 below.

Figure 2. DIR of c:\test

If you examine the screen above, you will see a listing of the directory. The first two columns are the date and time of the last write to that file, followed by whether the particular entry is a directory or a file, then the size of the file, and finally the name of the file. You may have noticed that there are two directories named . and .., which have special meaning in operating systems. The . stands for the current directory and the .. stands for the previous directory in the path. In the example above, .. stands for c:\.

The dir command also accepts wildcards: for example, dir *.txt will list only those files that end with .txt.

The Copy command - This command allows you to copy files from one location to another. To use this command you would type copy filetocopy copiedfile. For example, if you have the file c:\test\test.txt and would like to copy it to c:\windows\test.txt, you would type copy c:\test\test.txt c:\windows\test.txt and press enter. If the copy is successful, it will tell you so and give you back the prompt. If you are copying within the same directory you do not have to use the path. Here are some examples and what they would do:

copy test.txt test.bak :- Copies the test.txt file to a new file called test.bak in the same directory

copy test.txt \windows:- Copies the test.txt file to the \windows directory.

copy * \windows:- Copies all the files in the current directory to the \windows directory.

The Move command - This command allows you to move a file from one location to another. Examples are below:

move test.txt test.bak:- Moves the test.txt file to a new file renaming it to test.bak in the same directory.

move test.txt \windows:- Moves the test.txt file to the \windows directory.

move * \windows :-Moves all the files in the current directory to the \windows directory.

At this point you should use the help command to learn about the other available commands.

Redirectors

Redirectors are an important part to using the command prompt as they allow you to manipulate how the output or input of a program is displayed or used. Redirectors are used by appending them to the end of a command followed by what you are redirecting to. For example: dir > dir.txt. There are four redirectors that are used in a command prompt and they are discussed below:

> This redirector will take the output of a program and store it in a file. If the file exists, it will be overwritten; if it does not exist, a new file will be created. For example, the command dir > dir.txt will take the output of the dir command and place it in the dir.txt file. If dir.txt exists, it will be overwritten; otherwise it will be created.

>> This redirector will take the output of a program and store it in a file. If the file exists, the data will be appended to the current data in the file rather than overwriting it; if it does not exist, a new file will be created. For example, the command dir >> dir.txt will take the output of the dir command and append it to the existing data in the dir.txt file if the file exists. If dir.txt does not exist, it will create the file first.

< This redirector will take the input for a program from a specified file. For example, the date command expects input from a user, so if we had the command date < date.txt, it would take the input for the date program from the information contained in the date.txt file.

| This redirector is called a pipe. It will take the output of one program and pipe it into another program. For example, dir | sort would take the output of the dir command and use it as input to the sort command.

Batch Files

Batch files are files that have an extension ending in .bat. They are simply scripts that contain command prompt commands that will be executed in the order they are listed. To create a batch file, just make a file that ends in .bat, such as test.bat, and inside the file have the commands you would like. Each command should be on its own line and in the order you would like them to execute.

Below is an example batch file. It has no real use but will give you an example of how a batch file works. This test batch file contains the following lines of text:

cd \test
dir
cd \

If I were to run the test.bat file I created, I would have output that looks like the following:

Figure 3: Example of a batch file running.

As you can see from the figure above, my batch file executed each command in the sequence it was written in the batch file.

Console Programs

If a program is created for the express purpose of running within a command prompt, or console window, that program is called a console program. These are programs that are not graphical and can only be run properly from within a command prompt window.

Source:-http://www.bleepingcomputer.com/tutorials/windows-command-prompt-introduction/

Wednesday, 14 September 2016 05:10

ERP - enterprise resource planning


ERP is short for enterprise resource planning. Enterprise resource planning (ERP) is business process management software that allows an organization to use a system of integrated applications to manage the business and automate many back office functions related to technology, services and human resources. ERP software integrates all facets of an operation — including product planning, development, manufacturing, sales and marketing — in a single database, application and user interface.

ERP software is considered an enterprise application as it is designed to be used by larger businesses and often requires dedicated teams to customize and analyze the data and to handle upgrades and deployment. In contrast, small business ERP applications are lightweight business management software solutions, often customized for the industry you work in.

 

ERP Software

ERP software typically consists of multiple enterprise software modules that are individually purchased, based on what best meets the specific needs and technical capabilities of the organization. Each ERP module is focused on one area of business processes, such as product development or marketing. A business can use ERP software to manage back-office activities and tasks including the following:

  • Distribution process management
  • Supply chain management
  • Services knowledge base
  • Configure prices
  • Improve accuracy of financial data
  • Facilitate better project planning
  • Automate the employee life cycle
  • Standardize critical business procedures
  • Reduce redundant tasks
  • Assess business needs
  • Accounting and financial applications
  • Lower purchasing costs
  • Manage human resources and payroll

Some of the most common ERP modules include those for product planning, material purchasing, inventory control, distribution, accounting, marketing, finance and HR.

As the ERP methodology has become more popular, software applications have emerged to help business managers implement ERP into other business activities. These applications may incorporate modules for CRM and business intelligence, presented as a single unified package.

The basic goal of using an enterprise resource planning system is to provide one central repository for all information that is shared by all the various ERP facets to improve the flow of data across the organization.

Top 4 ERP Trends

The ERP field can be slow to change, but the last couple of years have unleashed forces which are fundamentally shifting the entire area. The following new and continuing trends affect enterprise ERP software:

Mobile ERP

Executives and employees want real-time access to information, regardless of where they are. It is expected that businesses will embrace mobile ERP for reports and dashboards and to conduct key business processes.

Cloud ERP

The cloud has been advancing steadily into the enterprise for some time, but many ERP users have been reluctant to place their data in the cloud. Those reservations have gradually been evaporating, however, as the advantages of the cloud become apparent.

Social ERP

There has been much hype around social media and how important (or not) it is to add to ERP systems. Certainly, vendors have been quick to seize the initiative, adding social media packages to their ERP systems with much fanfare. But some wonder if there is really much gain to be had by integrating social media with ERP.

Two-tier ERP

Enterprises once attempted to build an all-encompassing ERP system to take care of every aspect of organizational systems. But some expensive failures have gradually brought about a change in strategy: adopting two tiers of ERP.

ERP Vendors

Depending on your organization's size and needs, there are a number of enterprise resource planning software vendors to choose from in the large enterprise, mid-market, and small business ERP markets.

  • Large Enterprise ERP (ERP Tier I): The ERP market for large enterprises is dominated by three companies: SAP, Oracle and Microsoft. (Source: EnterpriseAppsToday; Enterprise ERP Buyer's Guide: SAP, Oracle and Microsoft; Drew Robb)
  • Mid Market ERP (ERP Tier II): For the midmarket vendors include Infor, QAD, Lawson, Epicor, Sage and IFS. (Source: EnterpriseAppsToday; Midmarket ERP Buyer's Guide; Drew Robb)
  • Small Business ERP (ERP Tier III): Exact Globe, Syspro, NetSuite, Visibility, Consona, CDC Software and Activant Solutions round out the ERP vendors for small businesses. (Source: EnterpriseAppsToday; ERP Buyer's Guide for Small Businesses; Drew Robb)

source:-http://www.webopedia.com/TERM/E/ERP.html

As companies use data and digital technology to improve more and more of their products, sales channels, and internal operations, IT teams have to support their company in different ways.

In essence, they've had to move from being tech experts, the ones who foretold a technology future and helped line managers make sense of it, to letting line managers take the lead. This is because, as getting the most from the technology and information at their disposal becomes an essential leadership competency for all managers, line managers often know more about the technology in their field than anyone in IT does. The key is to help them integrate that technology into the rest of the firm's infrastructure and get the most from it.

One of the most common ways that CIOs are now doing this is to provide IT support through an “end-to-end IT services” or a product management model. This is where IT packages all technologies, processes, and people to support business outcomes.

 

Three Things to Keep in Mind

And, just as digitization represents a massive shift in how companies operate, implementing end-to-end IT services is a significant change for IT teams. Introducing an end-to-end model should follow a phased and iterative methodology, and all IT teams should keep three things in mind early on in their transition.

1. Get support for the model early: Implementing end-to-end IT services changes how IT interacts with the business, with focus shifting away from technologies to business outcomes and capabilities. Not only that, it also drastically changes how IT approaches its activities. Given the extent of these changes, expect resistance from those in IT, and over-invest in efforts to gain support and buy-in. Start by identifying everyone in IT and the business who will be affected by the change. Take time up-front to clearly articulate the value and benefits of the end-to-end IT services model, and what changes they can expect in their day-to-day work.

2. Start small, learn, and then expand: Operationalizing end-to-end IT services means a steep learning curve for the IT function. While it is tempting to roll out many services simultaneously, IT teams will struggle to adjust to a new way of working and thinking. Instead, start by identifying areas that will benefit most from services. This allows both IT and the business to adjust to the new operating model and keeps the workload more manageable for service managers.

3. Know when good enough is good enough: Ultimately end-to-end IT services are about changing IT to support the business better. Focusing on finding the perfect metrics from the start is counterproductive. The same is true for unit costs and cost models. Spending time determining unit costs delays service roll out and may even erode some of the support for the end-to-end IT service model.

In the short term, use metrics that help demonstrate the value of the new model. An iterative service methodology enables service managers to launch a service and then refine and expand the metrics over time. Once a service is live, service managers can work with service consumers to determine the appropriate metrics and cost units.

Source:- https://www.cebglobal.com/blogs/corporate-it-3-keys-to-success-in-implementing-end-to-end-it-services/?business_line=information-technology

Friday, 09 September 2016 07:02

The Importance of Algorithms

Written by

Introduction

The first step towards an understanding of why the study and knowledge of algorithms are so important is to define exactly what we mean by an algorithm. According to the popular algorithms textbook Introduction to Algorithms (Second Edition by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein), "an algorithm is any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values as output." In other words, algorithms are like road maps for accomplishing a given, well-defined task. So, a chunk of code that calculates the terms of the Fibonacci sequence is an implementation of a particular algorithm. Even a simple function for adding two numbers is an algorithm in a sense, albeit a simple one.
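For instance, the Fibonacci computation mentioned above can be written as a short, well-defined procedure: input a count of terms, output those terms. The sketch below is in Python (the article itself is language-agnostic):

```python
def fibonacci(n):
    """Return the first n terms of the Fibonacci sequence as a list."""
    terms = []
    a, b = 0, 1
    for _ in range(n):
        terms.append(a)   # emit the current term
        a, b = b, a + b   # advance the sequence
    return terms

print(fibonacci(8))  # [0, 1, 1, 2, 3, 5, 8, 13]
```

This fits the textbook definition exactly: a well-defined input (n), a well-defined output (the list of terms), and an unambiguous procedure in between.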

Some algorithms, like those that compute the Fibonacci sequence, are intuitive and may be innately embedded into our logical thinking and problem solving skills. However, for most of us, complex algorithms are best studied so we can use them as building blocks for more efficient logical problem solving in the future. In fact, you may be surprised to learn just how many complex algorithms people use every day when they check their e-mail or listen to music on their computers. This article will introduce some basic ideas related to the analysis of algorithms, and then put these into practice with a few examples illustrating why it is important to know about algorithms.

Runtime Analysis

One of the most important aspects of an algorithm is how fast it is. It is often easy to come up with an algorithm to solve a problem, but if the algorithm is too slow, it’s back to the drawing board. Since the exact speed of an algorithm depends on the machine it runs on, as well as the details of its implementation, computer scientists typically talk about the runtime relative to the size of the input. For example, if the input consists of N integers, an algorithm might have a runtime proportional to N^2, represented as O(N^2). This means that if you were to run an implementation of the algorithm on your computer with an input of size N, it would take roughly C*N^2 seconds, where C is some constant that doesn’t change with the size of the input.
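To make "proportional to N^2" concrete, the illustrative Python sketch below counts the basic operations of a simple nested loop over every pair of N items. Doubling the input size roughly quadruples the work, which is the signature of O(N^2) growth:

```python
def count_pair_ops(n):
    """Count the basic operations performed by a simple O(N^2)
    nested loop, e.g. comparing every pair of N items."""
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1  # one basic operation per (i, j) pair
    return ops

print(count_pair_ops(100))  # 10000
print(count_pair_ops(200))  # 40000 -- double the input, 4x the work
```

The constant C absorbs how fast each of those basic operations runs on a particular machine; the N^2 shape of the growth stays the same everywhere.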

However, the execution time of many complex algorithms can vary due to factors other than the size of the input. For example, a sorting algorithm may run much faster when given a set of integers that are already sorted than it would when given the same set of integers in a random order. As a result, you often hear people talk about the worst-case runtime, or the average-case runtime. The worst-case runtime is how long it would take for the algorithm to run if it were given the most insidious of all possible inputs. The average-case runtime is the average of how long it would take the algorithm to run if it were given all possible inputs. Of the two, the worst-case is often easier to reason about, and therefore is more frequently used as a benchmark for a given algorithm. The process of determining the worst-case and average-case runtimes for a given algorithm can be tricky, since it is usually impossible to run an algorithm on all possible inputs. There are many good online resources that can help you in estimating these values.
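The sorting example can be made concrete with insertion sort (one specific algorithm, chosen here for illustration; the article does not name one). This sketch counts element comparisons, which differ dramatically between already-sorted (best-case) and reversed (worst-case) input:

```python
def insertion_sort_comparisons(data):
    """Insertion-sort a copy of data, returning the number of
    element comparisons performed along the way."""
    items = list(data)
    comparisons = 0
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if items[j] > key:
                items[j + 1] = items[j]  # shift larger element right
                j -= 1
            else:
                break
        items[j + 1] = key
    return comparisons

n = 100
print(insertion_sort_comparisons(list(range(n))))        # 99 (best case, ~N)
print(insertion_sort_comparisons(list(range(n, 0, -1)))) # 4950 (worst case, ~N^2/2)
```

The same algorithm, on inputs of identical size, does 99 comparisons in the best case and 4950 in the worst, which is why worst-case and average-case runtimes are reported separately.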

 

Approximate completion time for algorithms, N = 100

O(Log(N))      10^-7 seconds
O(N)           10^-6 seconds
O(N*Log(N))    10^-5 seconds
O(N^2)         10^-4 seconds
O(N^6)         3 minutes
O(2^N)         10^14 years
O(N!)          10^142 years
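The figures above are order-of-magnitude estimates. They can be reproduced by evaluating each growth function at N = 100 and dividing by an assumed machine speed; the Python sketch below assumes 10^8 simple operations per second (an assumption of this sketch, not a figure from the article, so individual rows may differ somewhat):

```python
import math

N = 100
OPS_PER_SECOND = 1e8  # assumed machine speed; real hardware varies

growth_functions = {
    "O(log N)":   math.log2(N),
    "O(N)":       N,
    "O(N log N)": N * math.log2(N),
    "O(N^2)":     N ** 2,
    "O(N^6)":     N ** 6,
    "O(2^N)":     2 ** N,
    "O(N!)":      math.factorial(N),
}

for name, operations in growth_functions.items():
    seconds = operations / OPS_PER_SECOND
    print(f"{name:12s} ~{seconds:.3g} seconds")
```

The takeaway is the gap between rows: 2^100 is about 1.3 * 10^30 operations, so even a machine a billion times faster would not make an O(2^N) algorithm practical at N = 100.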

Conclusion

The different algorithms that people study are as varied as the problems that they solve. However, chances are good that the problem you are trying to solve is similar to another problem in some respects. By developing a good understanding of a large range of algorithms, you will be able to choose the right one for a problem and apply it properly. Furthermore, solving problems like those found in TopCoder’s competitions will help you to hone your skills in this respect. Many of the problems, though they may not seem realistic, require the same set of algorithmic knowledge that comes up every day in the real world.

source:-https://www.topcoder.com/community/data-science/data-science-tutorials/the-importance-of-algorithms/

Wednesday, 07 September 2016 05:19

Google Teaches Kids to Code with Project Bloks

Written by

(Image Credit: Google)

Google has launched a new open hardware platform called Project Bloks for developers, researchers, and designers to build physical coding-based experiences for teaching kids how to program.

The project highlights that kids play and learn using their hands, which makes a tactile, open hardware platform a natural medium for learning an increasingly essential skill like coding. On the project's website, Google says it wants "to enable kids to develop computational thinking from a young age through coding experiences that are playful, tactile, and collaborative."

A reference kit has been created by design firm IDEO to show how Project Bloks hardware can be created. The goal, however, is to provide a platform that others can use to build their own devices. At the moment, Google's team for Project Bloks says it won't be releasing its own retail units.

Some of the hardware Google suggests could be built with Project Bloks includes:

  • Sensor Lab - This kit would allow you to experiment with sensors and map an input to an output, like switching on a light if the temperature dropped.
  • Music Maker - With the Music Maker you could compose a track using computational thinking by inputting different instruments, layering and looping sounds, and then playing it through a wireless speaker.
  • Coding Kit - With this kit you could put physical code together to send instructions to toys around you — like controlling a robot to create some art.

At the base of Project Bloks is a Raspberry Pi Zero-based "Brain Board," which acts as the CPU and provides power for the whole system. This central board interacts with "Pucks," which represent the physical programming language of Project Bloks.

The Pucks contain no active electronic components; they can have interactive elements such as a dial, or even be just a piece of paper with some conductive ink. Individual Pucks represent basic commands such as "turn on/off", "move left", "play sound," etc.

Project Bloks was brought to life by team lead Jayme Goldstein and tech lead Joao Wilbert in collaboration with the Google Research and Education teams, IDEO, and Paulo Blikstein, the Director of the Transformative Learning Technologies Lab at Stanford University.

Google is looking for educators, researchers, developers and parents who would like to participate in its research studies later this year.

Source:- http://www.developer-tech.com/news/2016/jun/27/google-teaches-kids-code-project-bloks/

Saturday, 03 September 2016 05:11

Search Engine Optimization (SEO) Services

Written by

There are a variety of search engine optimization services which offer solutions for a variety of ranking issues, and deficiencies. Depending on your goals, and needs one, or a combination, of the below services may be right for your website.

Website SEO Audit

A search engine optimization audit can come in varying levels of detail and complexity. A simple website audit can be as short as a few pages and would address glaring on-page issues such as missing titles and a lack of content. At the other end of the spectrum, a comprehensive website SEO audit will run to dozens of pages (for most larger sites, over one hundred) and address even the tiniest website elements that may be detrimental to the ranking ability of a website.

On-Page SEO

On-page or on-site search engine optimization refers to SEO techniques designed to fix the problems and potential issues that an SEO audit uncovers. This should always be part of any good SEO package. On-page SEO addresses a variety of fundamental elements (as they relate to SEO) such as page titles, headings, content and content organization, and internal link structure.

As with a website SEO audit, there are basic, as well as comprehensive services when it comes to on-page search engine optimization. At the most basic level, an on-page optimization campaign can be a one-time project which includes recommendations developed through an audit, and the implementation thereof. This type of on-page optimization would generally target the home page and a few other important pages on the website. More comprehensive on-page search engine optimization campaigns will use the findings of a highly detailed website SEO audit, and monitor results to guide ongoing changes to the on-page optimization.

Link Development (Link Building)

Link development is one of the most controversial and most talked-about (and written-about) topics in the search engine optimization industry. Backlinks are the most vital component of any search engine optimization campaign, and at the same time the most time-consuming and consequently most expensive part (assuming they are good quality links and not just random directory submissions and blog comment spam). Inevitably, then, many service providers offer inexpensive link building services in order to attract and impress potential clients. Such schemes include large volumes of directory submissions (e.g., 200 directory submissions per month), worthless blog and forum comment spam (e.g., 100 blog links per month), or article writing and submissions that result in extremely poor quality content published on equally low-quality article directories, none of which contributes in any positive way to ranking improvements. So if someone quotes you $500 per month for search engine optimization services that include large volumes of directory submissions, blog posts, articles, and blog/forum comments, all you will be doing is throwing your money away. This is not to say that you can't get link work for $500 per month; however, it won't be for a large volume of links.

Good quality link development work focuses on quality rather than quantity. A well-researched and relevant, good quality link is worth many times more than hundreds of free directory submissions.

The fundamentals of link building are, have always been and always will be, based on good quality (i.e., useful, interesting, entertaining, educational) content. Because if there is no good content on your site that people can link to, it will be very difficult to convince them to do so.

SEO Content Writing

SEO content writing is somewhat of a misnomer--it really should be replaced with high quality and well researched content writing. The term "SEO content writing" implies that there is a secret writing formula which turns plain everyday text into something magical that gets the attention of the search engines--this could not be further from the truth.

If you are looking for content writing services which will help your website attain higher rankings, what you are really looking for is high quality and well written content, and not SEO content. SEO content is what you would get from a writing sweatshop or someone who cannot afford to write good content because they are only charging you $12 per "article".

SEO content writing as a service can be useful, if shortcuts are not taken, and the content is not expected to perform magic. Well written, interesting and useful content will inevitably be found, and get attention on its own merits; however, it also helps lay the foundation for a successful link development campaign.

Code Optimization

Code optimization is a service you can expect at the highest levels of search engine optimization services, as it involves an overhaul of your website's HTML. The optimization of your HTML can impact search engine rankings in two ways. First, it can help alleviate code clutter and present your content in a format that is easy for machines (that is, search engine algorithms) to understand. Second, it can help reduce the load time of your website's pages, so that search engine spiders don't have to wait around while a page loads (because it's too long, has too many images, etc.).

A comprehensive search engine optimization campaign will have all of the above elements, but it will also incorporate other important services such as keyword research, ranking reports, traffic reports, and conversion tracking.

source:-http://www.whatisseo.com/seo-services.html
