Wednesday 31 October 2018

AI and Machine Learning from GigaSpaces InsightEdge harnessed by Magic Software's SaaS integration platform

GigaSpaces, the provider of InsightEdge, an in-memory real-time analytics platform for instant insights to action, has announced that InsightEdge has been selected by Magic Software Enterprises to power its Magic xpi end-to-end integration platform.

This integration will enable companies to make faster and smarter data-driven decisions to boost revenues, reduce costs, mitigate risks, and outperform competitors.

Providing the free flow of data between leading ERP, CRM, finance, MES and other systems, Magic xpi now leverages InsightEdge, which unifies real-time analytics and AI to achieve lean manufacturing, perform predictive maintenance and automate operational workflows. Machine learning models run with sub-second latency on hot data as it is generated, enriched with historical context from data lakes, resulting in accurate, real-time, actionable insights for improved decision making.
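
The release does not describe the underlying mechanics, but the pattern it alludes to (scoring a live event with a model while enriching it with historical context) can be sketched roughly as follows. Every name here, such as historical_stats and predict_failure_risk, is an illustrative assumption and is not part of InsightEdge or Magic xpi.

```python
# Rough, purely illustrative sketch of the pattern described above: enrich a
# streaming "hot" event with historical context, then score it with a model.
# Nothing here is GigaSpaces or Magic Software code; all names are invented.

from dataclasses import dataclass

@dataclass
class SensorEvent:
    machine_id: str
    vibration: float
    temperature: float

# Stand-in for historical context pulled from a data lake.
historical_stats = {"press-01": {"avg_vibration": 0.42, "avg_temperature": 71.0}}

def predict_failure_risk(features: dict) -> float:
    """Stand-in for a trained model; flags large deviations from history."""
    vib_ratio = features["vibration"] / max(features["avg_vibration"], 1e-6)
    temp_delta = features["temperature"] - features["avg_temperature"]
    return max(0.0, min(1.0, 0.5 * (vib_ratio - 1.0) + 0.02 * temp_delta))

def score(event: SensorEvent) -> float:
    context = historical_stats.get(event.machine_id,
                                   {"avg_vibration": 1.0, "avg_temperature": 70.0})
    features = {"vibration": event.vibration, "temperature": event.temperature, **context}
    return predict_failure_risk(features)

print(score(SensorEvent("press-01", vibration=0.9, temperature=78.0)))  # ~0.71
```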

“InsightEdge enables our customers to generate insights for C-level executives and line of business management and optimizes automated processes from an unprecedented amount of data generated by sensors and GPS readings from machines, products, employees, as well as inputs from shop floor apps, and back office systems,” said Yuval Lavi, VP Product Innovation at Magic Software.  “The combination of integrated data from multiple sources, extreme data processing and real-time analytics provides insights that impact leaner operations and an improved customer experience.”

Magic xpi running with InsightEdge enables several data-driven processes, for example, using sensors to monitor equipment and predict breakdowns, performing predictive analytics to determine which and how many quality tests should be performed, and sharing supplier production data with partners and customers to identify delivery delays and adjust processes accordingly.

McKinsey has noted that AI-driven predictive maintenance initiatives can deliver a 10% reduction in annual maintenance costs, a 20% reduction in downtime, and a 25% reduction in inspection costs.

This is not the first collaboration between Magic and GigaSpaces. Magic has been using GigaSpaces’ XAP in-memory computing platform for years to deliver fast data streaming, aggregation and calculations, and last year announced an integration that leverages InsightEdge as an IoT hub. Magic xpi customers will now have the option to run InsightEdge and experience the benefits of real-time analytics.

“Data integration combined with real-time advanced analytics is needed to fuel the factory of the future,” said Yoav Einav, VP of Products for GigaSpaces. “With the incorporation of InsightEdge capabilities into Magic xpi, we are bringing the power of machine learning to the shop floor and the back office to help companies optimise processes to maximize efficiencies and exceed their revenue goals.”

GigaSpaces and Magic Software are presenting “The Insight-Driven Organization: Leverage AI to Transform Your Data into Revenue” on 6 November at the Design Offices in Munich, Germany, starting at 8:30 AM. The event will include case studies and discussions about market challenges, trends and real-world best practices for enterprises to innovate with confidence and become insight-driven.

Go to Source

The post AI and Machine Learning from GigaSpaces InsightEdge harnessed by Magic Software's SaaS integration platform appeared first on Statii News.



source http://news.statii.co.uk/ai-and-machine-learning-from-gigaspaces-insightedge-harnessed-by-magic-softwares-saas-integration-platform/

Intoware and Librestream partner to expand their product offering

UK-based Intoware, creators of the WorkfloPlus software platform that converts existing paper-based and human work processes into easy-to-follow, step-by-step digital work instructions, has announced a partnership agreement with Librestream Technologies Inc., the pioneer of the Remote Expert platform for industrial enterprises.

Despite the growing adoption of technologies by businesses that enhance productivity – often under the banner of ‘Industry 4.0’ – many industries continue to rely on work processes that create mountains of paper. According to research by PricewaterhouseCoopers, companies are expected to invest $907 billion per year by 2020 in digitizing their businesses.

As part of their agreement, Librestream will include Intoware’s WorkfloPlus work instruction software within its Onsight platform. Meanwhile, Intoware will add Librestream’s world-leading Onsight Connect software to its own product portfolio, launching as WorkfloPlus Remote Expert. 

WorkfloPlus Remote Expert software delivers a fully collaborative environment that allows teams to troubleshoot, assess, and rapidly resolve issues in the field by bringing in virtual experts via a live video link. Remote Expert works across a full range of mobile devices including smartphones, tablets, computers and smart glasses.

By using Remote Expert and WorkfloPlus’ digital work instruction platform, workers in the field can harness the knowledge and expertise of colleagues at the touch of a button. Workers can share audio and video, circle and mark areas that need attention, adjust lighting, and record video or capture still images to build a lasting knowledge base. Meanwhile, back at HQ, the wider team can rapidly diagnose, inspect, and resolve issues. The software is designed to perform in ultra-low-bandwidth situations such as offshore installations, remote locations with poor cellular coverage, or even basements and sites where it can be hard to connect.

“Librestream has shown itself as a real trailblazer with its Onsight platform. Collaborating with them to expand our reach into new territories and adding its Connect solution to our own WorkfloPlus suite is a significant step for Intoware, and should result in even better products for our customers,” said James Woodall, CTO and founder of Intoware.

“Our customers have asked for an integrated platform that includes access to remote guidance, digital work instructions, and augmented content management. We selected Intoware after an extensive market analysis and product review process. This capability is in high demand by our customers and we are actively working with them to add Onsight Flow to their digital ecosystems,” added Kerry Thacher, CEO of Librestream.

Go to Source

The post Intoware and Librestream partner to expand their product offering appeared first on Statii News.



source http://news.statii.co.uk/intoware-and-librestream-partner-to-expand-their-product-offering/

Layman’s Guide to Bone Marrow Transplant

Bone marrow, or simply marrow, is the soft, spongy tissue found inside the bones. It contains blood-forming cells called blood stem cells, which create different types of blood cells including:

  • White blood cells (WBCs), which form the body’s defense system and fight infections.
  • Red blood cells (RBCs), which carry oxygen throughout the body like a supply chain system.
  • Platelets, which control bleeding.

A bone marrow transplant, or BMT for short, is a medical treatment that replaces unhealthy marrow with healthy marrow. The best BMT doctors in India have developed high-quality and affordable bone marrow transplant treatment, and India has become a leading destination in Asia for the procedure.

What is a bone marrow transplant?

A BMT is a medical procedure that replaces damaged or destroyed marrow with healthy bone marrow stem cells. Healthy stem cells are extracted and filtered, then infused back into the donor or into a recipient. Before the transplant, chemotherapy, radiation, or both may be applied to ensure that the damaged marrow is destroyed. This is done in the following ways:

  • Ablative treatment — In this method, a high dose of chemotherapy, radiation, or both is applied to kill the cancer cells. The healthy bone marrow is also deliberately destroyed in the process so that new stem cells can grow in the bone marrow.
  • Reduced-intensity treatment — Also called a mini-transplant, this uses lower doses of radiation and chemotherapy before the transplant. It is generally done for older people or those who have other health problems.

What are the types of bone marrow transplant?

Bone marrow transplants are of the following three types:

  • Autologous bone marrow transplant — Healthy stem cells are removed from the patient’s own bone marrow before chemotherapy and radiation and stored in a freezer. After the radiation or chemotherapy treatment, the stored stem cells are put back into the patient’s body to grow normal, healthy blood cells. Because the donor and recipient are the same person, this is called an autologous transplant.
  • Allogeneic bone marrow transplant — In ‘allogeneic’, the prefix ‘allo’ means other or different: the stem cells are taken from another person, called a donor. A special test is performed to check whether the donor is a good match; at a minimum, the donor’s genes must partly match the patient’s. A brother or sister is most often a good match, and sometimes parents, children, and other relatives may match as well. Otherwise, a donor can be sought through national bone marrow registries to find the best possible match.
  • Umbilical cord blood transplant — This is another type of allogeneic transplant. In this case, stem cells are taken from a newborn baby’s umbilical cord immediately after birth and stored frozen until they are required for a transplant. Because the stem cells in cord blood are immature, a less exact match is needed, but since there are far fewer of them, blood counts take much longer to recover.

The bone marrow transplant itself is usually performed after the chemotherapy and radiation. The best hospitals for bone marrow transplant in India have the entire infrastructure needed for this treatment and follow world-class procedures. The stem cells are delivered into the patient’s bloodstream through a central venous catheter, in a way very similar to a blood transfusion. The stem cells travel through the blood and finally reach the bone marrow, so no surgery is required.

How are the stem cells collected?

The donor’s stem cells are collected in the following two ways:

  • Bone marrow harvest — This is a minimally invasive procedure performed under general anesthesia, so the donor is asleep throughout. The bone marrow is taken out from the back of both hip bones. The amount of marrow removed depends on the body weight of the recipient to whom it will be donated.
  • Leukapheresis — The donor is given injections for several days so that stem cells move from the bone marrow into the blood. Blood is then drawn from the donor through an IV line, and a machine separates out the white blood cells that contain the stem cells. These are stored frozen and later given to the recipient.

Which diseases are treated with bone marrow transplantation?

A bone marrow transplant is performed to treat illnesses such as:

  • Aplastic Anemia
  • Leukemia
  • Thalassemia
  • Lymphoma
  • Multiple Myeloma
  • Neuroblastoma

How does a patient prepare before transplantation?

Before the transplantation procedure, various tests are carried out depending on the patient’s health condition and medical history. One or two tubes, called catheters, are also inserted into a blood vessel in the neck or arm to give fluids and nutrition or to draw blood.

How does a patient recover after transplantation?

Depending on whether the transplant is autologous or allogeneic, the procedure may be performed in a hospital as an outpatient procedure with no overnight stay. Otherwise, the duration of the hospital stay depends on:

  • How much chemotherapy or radiation the patient was given.
  • The type of transplant performed.
  • The hospital’s treatment protocol.
  • How closely the patient’s blood counts and vital signs need to be monitored.
  • Completion of anti-infection treatment, including antibiotics, antifungals, and antiviral medicines.
  • How soon the patient can eat by mouth without side effects.

What is the cost of bone marrow transplant treatment in India?

The cost of a bone marrow transplant in India varies from 15 lakh to 30 lakh rupees, depending on the type of transplant and on the age and general health of the patient.

Go to Source

The post Layman’s Guide to Bone Marrow Transplant appeared first on Statii News.



source http://news.statii.co.uk/laymans-guide-to-bone-marrow-transplant/

ERP Has Reinvented Itself Again, And This Time It’s Going Designer


As early as the ’90s, businesses and analysts alike have foretold the death of the ERP system. Over 20 years later, however, ERP is still alive and well. Globalization, digitalization, the internet and a whole host of other technologies have made it virtually impossible for businesses to move away from ERP. There’s simply too much critical data housed in the underlying databases for elimination to ever be a viable option.

ERP was criticized for its complexity and for being difficult to change without expensive and time-consuming coding — and still is. Customization is almost always a necessity and can mean locking into a current version or vendor because those customizations are impossible to maintain.

However, this whole dynamic has changed in a relatively short period of time thanks to cloud technology advances and the burgeoning low-code/no-code market. Citizen developers are taking advantage of visual coding environments and machine-generated code to easily and quickly create high-performance applications that are custom-built to fit a business’ exact needs. While these low-code/no-code environments have been quickly adopted in consumer technology and app development, adoption has been slower in the enterprise space — and for good reason. Skeptics worry that we’ve been down this road before and that the rise of the “citizen developer” will lead to Shadow IT all over again: ungoverned, unmanaged solutions built outside of IT’s domain.

I believe the citizen developer market will be huge in the coming years, but this growth should be driven by the citizen developer – not big software vendors like us. The true power of low-code/no-code development comes from the industry-specific and tailored functionality that can be added by smaller, specified app development platforms and their citizen users, which can be far more tailored to a single micro-industry than the software of an industry giant ever could. Even our company, which offers ERP solutions for many industries, can’t compete with the enterprise-specific apps citizen developers may create. And those apps will only be augmented by interfacing with and extending the capabilities of ERP platforms that already exist (no matter which ERP vendor they come from).

To prepare for and be ready to fully take advantage of the benefits low-code/no-code can offer, organizations should put in place an integrated, “all hands on deck” approach, where apps built by citizen developers are governed and managed by IT, leading to a more successful digital transformation. While there will undoubtedly still be bumps in the road ahead, it’s time to set aside reservations and explore the possibilities of low-code/no-code — but not without a solid plan in place.

Read More Here

Article Credit: Forbes

Go to Source

The post ERP Has Reinvented Itself Again, And This Time It’s Going Designer appeared first on Statii News.



source http://news.statii.co.uk/erp-has-reinvented-itself-again-and-this-time-its-going-designer/

You Can Thank Us Later – 5 Reasons Small Manufacturers May Need Cloud ERP


In the face of growing competition and a challenging business climate, many job shops and manufacturers are looking for ways to reduce costs, streamline operations and improve the bottom line. Implementing Enterprise Resource Planning (ERP) software is a proven solution for running a business efficiently and effectively. Yet, a large percentage of job shops and manufacturers have held back from implementing an ERP system due to high initial costs, long implementation times, and competing demands for time and resources.

Cloud ERP, sometimes referred to as Software-as-a-Service, or SaaS, delivers financial, implementation, and operational benefits to job shops and manufacturers.

As job shops and manufacturers work to improve operational efficiency and increase their competitive standing, most realize that an ERP system can help streamline their business processes. However, recent estimates suggest that the lack of ERP is holding companies back, and this is especially true of smaller companies. High initial costs – including software, hardware, and supporting infrastructure – lengthy and complex implementation projects, and the staffing necessary to implement and maintain ERP systems all become barriers to ERP implementation.

With traditional on-premises software deployments, customers purchase, install, manage, and maintain the software, as well as supporting infrastructure such as hardware and networks, in house. In a cloud deployment, the software vendor hosts, manages, and provides customers access to the software as a service over the Internet. Rather than pay for the software up front out of their capital budgets, cloud customers license it on a subscription basis, usually per user per month or per a specified number of transactions. Ongoing maintenance, upgrades, and support for the software and infrastructure are all the responsibility of the software vendor and are included within the subscription fee. ERP systems that are cloud deployed can also drive a significant reduction in the total cost of ownership compared with legacy on-premises deployments. The cloud model offers a wide range of financial and operational benefits for manufacturers, which may include lower and more predictable ongoing costs, faster implementations and time-to-value, reduced cost of ownership, greater reliability and availability, and reduced IT complexity.
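
To make the cost comparison concrete, here is a back-of-the-envelope sketch of five-year costs under the two models. Every figure below (licence price, user count, subscription rate) is a hypothetical assumption for illustration, not pricing from any vendor.

```python
# Hypothetical back-of-the-envelope comparison of on-premises vs. cloud ERP
# cost over five years. Every number here is an assumption for illustration,
# not a quote from any vendor.

users = 25
years = 5

# On-premises: up-front licences and hardware, plus annual maintenance and IT support.
onprem_license = 60_000
onprem_hardware = 20_000
onprem_annual_maintenance = 0.20 * onprem_license   # assumed ~20% of licence per year
onprem_annual_it_support = 15_000

onprem_total = (onprem_license + onprem_hardware
                + years * (onprem_annual_maintenance + onprem_annual_it_support))

# Cloud/SaaS: subscription per user per month; upgrades and infrastructure included.
saas_per_user_per_month = 80
saas_total = users * saas_per_user_per_month * 12 * years

print(f"On-premises, {years}-year cost: ${onprem_total:,.0f}")
print(f"Cloud ERP,   {years}-year cost: ${saas_total:,.0f}")
```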

Read More Here

Article Credit: Small Business Trends

Go to Source

The post You Can Thank Us Later – 5 Reasons Small Manufacturers May Need Cloud ERP appeared first on Statii News.



source http://news.statii.co.uk/you-can-thank-us-later-5-reasons-small-manufacturers-may-need-cloud-erp/

Plugging the functionality gaps of ERP systems


Enterprise resource planning (ERP) systems are an indispensable tool in an organisation’s financial strategy, providing a highly transparent way to collect, manage, track, and analyse enterprise-wide business data. ERP systems are crucial for managing numerous accounting processes, ensuring compliance, and reporting to stakeholders. But they come up short in one important respect—automated account reconciliations that ensure a timely and accurate financial close.

Although ERP systems effortlessly attend to the “nuts and bolts” of accounting, verifying the journal entries, subledger tie-outs, and other complex transactional information, they don’t specifically validate this data for the financial close. This task falls to finance and accounting, which typically address the need manually, often using complicated, multi-line-item spreadsheets.

In the post-Sarbanes-Oxley Act (SOX) era, the importance of an accurate and fully validated close can’t be overstated. ERP systems are great at verifying, for example, whether the accounts payable (AP) subledger agrees with the AP general ledger (GL) balance, or whether the inventory subledger agrees with the inventory GL balance.
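
As a rough illustration of the kind of automated check the article contrasts with manual spreadsheets (comparing a subledger total against its GL control account), consider the sketch below; the account names, balances, and tolerance are assumptions.

```python
# Minimal sketch of an automated subledger-to-GL reconciliation check.
# Account names, balances, and the tolerance are hypothetical.

ap_subledger_items = {          # open accounts-payable items by vendor
    "vendor-001": 12_450.00,
    "vendor-002":  3_200.75,
    "vendor-003":  8_019.25,
}
ap_general_ledger_balance = 23_670.00   # AP control account balance in the GL
TOLERANCE = 0.01                        # allowable rounding difference

subledger_total = sum(ap_subledger_items.values())
difference = round(subledger_total - ap_general_ledger_balance, 2)

if abs(difference) <= TOLERANCE:
    print("AP subledger ties out to the general ledger.")
else:
    print(f"Reconciling item needed: subledger is off by {difference:+,.2f}")
```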

Since the ERP system can’t complete the “last mile” of the financial close process, many accountants must still step back to the last century and crunch numbers in a paper-intensive process. And since humans are imperfect beings, these manual processes often produce errors—such as keying in the wrong balance. Spreadsheets are cumbersome documents, so they create the risk of version control and data integrity issues. Tracking the workflows across a global business via streams of e-mails and printed documents is a nightmare. Worst of all, the specific goal of the manual processes—validating the accuracy of the ERP-verified subledger data—is often impossible to achieve because of inefficiency and inaccuracy.

It doesn’t have to be this way. Technology is available to pick up where ERP systems leave off.

Resting Easy

All finance and accounting organisations seek assurance and comfort that their account reconciliation and financial close processes are accurate, for both business performance and regulatory compliance reasons. More and more organisations are switching to a dedicated accounting solution to handle their financials, and for good reason. Handling a business’s finances with clumps of expenses, boxes of receipts, and spreadsheets of income and expenses exposes information to potential lapses and missed deductions. This can be resolved with the right accounting solution, which provides an up-to-date, computerised accounting file that is crucial, especially during tax season.

Accounting plays a key role in the functioning of any business. With the global financial crisis in the recent past and a number of businesses expanding on a daily basis, a strong accounting system is a necessity for any business. Many accounting and bookkeeping firms are embracing the trends in accounting in order to reshape their business and simplify their work to a great extent.

Many companies have built homegrown software systems that provide greater visibility and control around the reconciliation process. But these systems involve a substantial amount of up-front and ongoing work and resources. They often fall short of third-party financial close suites, which have more functionality and integrate smoothly with ERP systems.

Read More Here

Article Credit: NA

Go to Source

The post Plugging the functionality gaps of ERP systems appeared first on Statii News.



source http://news.statii.co.uk/plugging-the-functionality-gaps-of-erp-systems/

Why today’s ERP should stand for Earn, Rest and Play


Iron Man has his powered armoured suit, Captain America his shield and Wonder Woman her bracelets of submission. When we are young, it’s tools like these we think will make us ‘super’ people. But as we become adults and come to grips with reality, our definition of what it takes to be ‘super’ changes completely.

Especially in business, we have taken the term ‘super’ to describe leaders who have managed to find the right balance between work and life. The modern superhero we see lauded in the media is someone who is successful in their field while also making enough time for family and hobbies outside of work. That might mean getting home in time to put the kids to bed, finding time to train for a marathon, or just putting in the hours to learn how to play an instrument.

We are in awe of these individuals not because of a tremendous feat of strength or heroic act, but because achieving a perfect work/life balance while running a successful business is one of the hardest things to do as an adult.

The job is doubly difficult for CFOs, who are being challenged to navigate a complex economic landscape and be the CEO’s right hand. It takes more than just superhero thinking and experience to achieve this; it takes super-powered technologies capable of supporting decision-making at the highest level quickly and accurately.

Today, it is their ERP cloud application that delivers the real power CFOs need. ERP may stand for “Enterprise Resource Planning”, but it delivers so much more when done right. Here’s why ERP today should stand for “Earn, Rest and Play”, taking the headache of administration out of running a finance organisation for CFOs so they can find the right work/life balance and prove their true worth as modern superheroes.

Go to Source

The post Why today’s ERP should stand for Earn, Rest and Play appeared first on Statii News.



source http://news.statii.co.uk/why-todays-erp-should-stand-for-earn-rest-and-play/

Tuesday 30 October 2018

How The IoT Will Reshape The City Experience


What cities will look like tomorrow greatly depends on what today’s urban leaders, planners, businesses—and of course, residents—are looking for today.

Understanding tech for tomorrow’s city begins with grasping today’s outstanding challenges. A Juniper Research report written in conjunction with Intel contends that creating a more intelligent city begins with self-awareness in areas from air pollution and traffic congestion to overcrowding and pockets of inequality. “Smart cities are those that recognize these challenges and adopt their planning and strategy to address them,” the report states.

Here are three areas where rapid advancements in IoT technology are already beginning to shape the cities of tomorrow, and how residents benefit from (and participate in) those substantial improvements.

1. Computer vision: With computer vision, cameras and other visual sensors capture raw video as data and process it into useful, actionable information. In smart cities, improved computer vision will give rise to enhanced quality of services, improved public safety, reduced congestion, and new levels of efficiency. One example of computer vision in action involves smart streetlights. Outfitted with this technology, they can reduce brightness when they detect no people or vehicles present (saving energy); monitor nearby parking to enable drivers to quickly find the nearest vacant space; or monitor pedestrian and vehicle traffic flows to optimize traffic and crosswalk signals. In terms of public safety, computer vision-enabled streetlights can send alerts on dangerous potholes or blocked storm drains.

2. Edge computing. An easy way to understand edge computing is to think of the phrase “where the action is.” Rather than relay information back to a central hub, cloud or data center mainframe, edge computing processes and analyzes data right at the source of where it’s collected.

So instead of a device or sensor sending its data over the internet, it can process this data itself—essentially becoming its own mini data center. And edge computing is strongly on the rise for one big reason: An IoT data deluge results when adding more devices to a smart cities network.

Smart buildings represent a prominent example of edge computing in action. As people occupy or crowd one part of a building, for example, sensors pick up this activity and via edge computing, can adjust lighting and climate control to optimize your comfort and visibility—while using far less energy on unoccupied floors. The result: smarter green buildings that react to our daily usage.

Edge computing (along with the emergence of powerful 5G data networks) will also play a major role in enabling driverless cars. Vehicles such as Google’s Waymo cars produce 1 GB of data per second. Sending that kind of data somewhere else for processing poses all sorts of problems, the biggest of which is latency—that is, a delay between when the data is generated in real time and when it is processed. And when it comes to a car driving itself, even a few seconds of latency is, if you will, a non-starter.
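
Returning to the smart-building example above, the "process at the source" idea can be sketched in a few lines: the device acts locally and forwards only a small summary upstream, instead of streaming every raw reading to a data centre. The names and thresholds below are illustrative assumptions, not any vendor's API.

```python
# Minimal sketch of the edge-computing idea: react locally and send only a
# compact summary upstream. All names and thresholds are assumptions.

raw_occupancy_readings = [0, 0, 3, 5, 4, 0, 0, 0, 2, 6]   # people counted per minute

def process_at_the_edge(readings):
    occupied_minutes = sum(1 for r in readings if r > 0)
    summary = {
        "occupied_fraction": occupied_minutes / len(readings),
        "peak_occupancy": max(readings),
    }
    # Local action taken immediately, with no round trip to a data centre.
    lighting_on = summary["occupied_fraction"] > 0.3
    return summary, lighting_on

summary, lighting_on = process_at_the_edge(raw_occupancy_readings)
print(summary, "lights on:", lighting_on)   # only this summary would be sent upstream
```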

Read More Here

Article Credit: Forbes

Go to Source

The post How The IoT Will Reshape The City Experience appeared first on Statii News.



source http://news.statii.co.uk/how-the-iot-will-reshape-the-city-experience/

How To Protect Healthcare IoT Devices In A Zero Trust World


Over 100M healthcare IoT devices are installed worldwide today, growing to 161M by 2020, attaining a Compound Annual Growth Rate (CAGR) of 17.2% in just three years according to Statista.

  • Healthcare executives say privacy concerns (59%), legacy system integration (55%) and security concerns (54%) are the top three barriers holding back Internet of Things (IoT) adoption in healthcare organizations today according to the Accenture 2017 Internet of Health Things Survey.
  • The global IoT market is projected to soar from $249B in 2018 to $457B in 2020, attaining a Compound Annual Growth Rate (CAGR) of 22.4% in just three years according to Statista.
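
Both CAGR figures quoted above follow from the standard compound-growth formula, CAGR = (ending value / starting value)^(1/years) − 1. A quick check:

```python
# Quick check of the CAGR figures quoted above.
# CAGR = (ending_value / starting_value) ** (1 / years) - 1

def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

print(f"Healthcare IoT devices, 100M -> 161M over 3 years: {cagr(100, 161, 3):.1%}")
print(f"Global IoT market, $249B -> $457B over 3 years:    {cagr(249, 457, 3):.1%}")
```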

Healthcare and medical device manufacturers are in a race to see who can create the smartest and most-connected IoT devices first. Capitalizing on the rich real-time data monitoring streams these devices can provide, many see the opportunity to break free of product sales and move into more lucrative digital service business models. According to Capgemini’s “Digital Engineering, The new growth engine for discrete manufacturers,” the global market for smart, connected products is projected to be worth $519B to $685B by 2020. The study can be downloaded here (PDF, 40 pp., no opt-in). 47% of a typical manufacturer’s product portfolio by 2020 will be comprised of smart, connected products. In the gold rush to new digital services, data security needs to be a primary design goal that protects the patients these machines are designed to serve. The following graphic from the study shows how organizations producing smart, connected products are making use of the data generated today.

Source: Capgemini, “Digital Engineering, The New Growth Engine for Discrete Manufacturers” study.

Healthcare IoT Device Data Doesn’t Belong For Sale On The Dark Web

Every healthcare IoT device, from insulin pumps and diagnostic equipment to remote patient monitoring, is a potential attack surface for cyber adversaries to exploit. And the healthcare industry is renowned for having the majority of system breaches initiated by insiders: 58% of healthcare system breach attempts involve inside actors, which makes this the leading industry for insider threats today according to Verizon’s 2018 Protected Health Information Data Breach Report (PHIDBR).

Many employees working for medical providers are paid modest salaries and often have to regularly work hours of overtime to make ends meet. Stealing and selling medical records is one of the ways those facing financial challenges look to make side money quickly and discreetly. And with a market on the Dark Web willing to pay up to $1,000 or more for the most detailed healthcare data, according to Experian, medical employees have an always-on, 24/7 marketplace to sell stolen data. 18% of healthcare employees are willing to sell confidential data to unauthorized parties for as little as $500 to $1,000, and 24% of employees know of someone who has sold privileged credentials to outsiders, according to a recent Accenture survey. Healthcare IoT devices are a potential treasure trove to inside and outside actors who are after financial gains by hacking the IoT connections to smart, connected devices and the networks they are installed on to exfiltrate valuable medical data.

Read More Here

Article Credit: Forbes

Go to Source

The post How To Protect Healthcare IoT Devices In A Zero Trust World appeared first on Statii News.



source http://news.statii.co.uk/how-to-protect-healthcare-iot-devices-in-a-zero-trust-world/

What’s the IoT doing to your data center?

Monday 29 October 2018

How Do We Address The Reproducibility Crisis In Artificial Intelligence?


Major breakthroughs in AI have seen machines being entrusted with business and safety-critical decisions, from guiding vehicles to diagnosing diseases. Yet a reproducibility crisis is creating a cloud of uncertainty over the entire field, eroding the confidence on which the AI economy depends.

Reproducibility, the extent to which an experiment can be repeated with the same results, is the basis of quality assurance in science because it enables past findings to be independently verified, building a trustworthy foundation for future discoveries. This is crucial because previous breakthroughs are the barometer by which to measure all subsequent progress.

Without the capacity to reproduce past results, the entire basis on which machines are increasingly making legal, corporate and even medical decisions is called into question. This could stop us from being able to benefit from some of the greatest advances in the field, from the AIs that power smart cities to those that find new drug treatments.

For example, deep reinforcement learning (RL), whereby machines learn by trial and error until they find behaviour that earns a reward, could enable driverless cars to endlessly crisscross in virtual reality until they learn to safely change lanes in the real world. Yet experts found that RL results are not easily reproducible, raising questions over whether it can be relied on to ensure road safety. An analysis of 30 AI papers similarly found that the majority of them were difficult to reproduce because key records of their methodologies were missing, from training data sets to study parameters.

As a result, Google researcher Ali Rahimi has likened AI to alchemy. The way in which alchemy produced new innovations such as glass alongside false cures such as leeches is directly analogous to the way that AI has discovered potential cancer cures yet failed to distinguish masks from faces.

Lack Of Traceability

The fundamental problem is that data science is not governed by the same generally accepted standards of quality assurance as other fields of science. As a result, the data trail charting the road from the origins of AI to its latest iterations is shrouded in mystery.

There are currently no universal standards governing the data capture, curation and processing techniques that give vital meaning and context to AI experiments. This is the equivalent of climate scientists investigating global warming without any rules on how to document the locations or units of temperature readings.

This is particularly concerning as there are so many iterations involved in developing machine-learning tools and there is no universal benchmark of good practice for implementing and recording them all. A single experiment to create a facial-recognition system involves a complex layer cake of processes, from training runs to software updates, file changes and tweaks to the algorithm. If any of these steps is not meticulously recorded, it would be painstakingly difficult to modify the AI or reproduce the original results.
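
The article does not prescribe a fix, but the kind of record-keeping it describes as missing can be sketched simply: log the seed, hyperparameters, dataset identifier, and other run metadata for every experiment so it can be re-run later. The fields and file layout below are assumptions, not an established standard.

```python
# Illustrative sketch only: record the metadata needed to re-run a training
# experiment later. The fields and file layout are assumptions, not a standard.

import hashlib
import json
import random
import time

def log_experiment(params: dict, dataset_id: str, log_file: str = "runs.jsonl") -> dict:
    record = {
        "timestamp": time.time(),
        "params": params,                 # learning rate, epochs, seed, etc.
        "seed": params.get("seed"),
        # Fingerprint of the dataset identifier; hashing the file contents
        # themselves would be an even stronger guarantee.
        "dataset_fingerprint": hashlib.sha256(dataset_id.encode()).hexdigest()[:16],
    }
    with open(log_file, "a") as fh:       # append one JSON line per run
        fh.write(json.dumps(record) + "\n")
    return record

params = {"seed": 42, "learning_rate": 1e-3, "epochs": 20}
random.seed(params["seed"])               # fixing the seed is one ingredient of reproducibility
print(log_experiment(params, dataset_id="faces_v3.csv"))
```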

Read More Here

Article Credit: Forbes

Go to Source

The post How Do We Address The Reproducibility Crisis In Artificial Intelligence? appeared first on Statii News.



source http://news.statii.co.uk/how-do-we-address-the-reproducibility-crisis-in-artificial-intelligence/

A.I. Songwriting Has Arrived. Don’t Panic


“IT’S CHEATING.” That’s the response you’ll hear from self-proclaimed music purists talking about technological innovation in song creation. Sampling, synthesizers, drum machines, Auto-Tune—all have been derided as lazy ways to make chart-topping hits because they take away the human element. (With apologies to Vanilla Ice, Gary Numan, Prince, and T-Pain.)

The new argument among fans and musicians will be about the use of artificial intelligence in songwriting. According to several estimates, in the next decade, between 20% and 30% of the top 40 singles will be written partially or totally with machine-learning software. Today, recording pros can use A.I.-powered programs to cue an array of instrumentation (from full orchestral arrangements to hip-hop beats), then alter it by mood, tempo, or genre (from heavy metal to bluegrass).

“It’s like the future of self-driving cars,” says Leonard Brody, entrepreneur and cofounder of Creative Labs, a joint venture with Creative Artists Agency that invests in programs to help audio creators get their works delivered to the public. “Level 1 is an artist using a machine to assist them. Level 2 is where the music is crafted by a machine but performed by a human. Level 3 is where the whole thing is machines.”

A.I. claiming ownership of a third of the top 40 may be surprising to the casual listener, but it’s a low bar for Drew Silverstein, CEO of Amper, an A.I.-based music composition software company in New York City. Amper’s product allows musicians to create and download “stems”—unique portions of a track like a guitar riff or a hi-hat cymbal pattern—and rework them. Silverstein sees predictive tools as an evolution in the process of music creation. “Starting from quill and parchment centuries ago, then moving into analog and tape and mobile [devices]—A.I. is really just the next step,” he says.

Silverstein isn’t the only one with that view. Large technology companies also offer A.I.-powered tools and services for music-making. Among them: IBM’s Watson Beat, Google Magenta’s NSynth, Sony’s Flow Machines, and Spotify’s Creator Technology Research Lab. These resources, intended for use by artists and labels, use algorithms to analyze libraries of songs and sales charts to predict what may have the best chance of charting (and when).

Though the latest developments in A.I. are helping fuel its use in popular music, it’s not really a new idea. More than two decades ago, David Bowie helped create the Verbasizer, a program for Apple’s Mac that randomized portions of his inputted text sentences to create new ones with new meanings and moods—an advanced version of a cut-up technique he used, writing out ideas, then physically slicing and rearranging them to see what stuck. Bowie made use of the Verbasizer for his 1995 album Outside. “What you end up with is a real kaleidoscope of meanings and topic and nouns and verbs all sort of slamming into each other,” said the influential pop star in a 1997 documentary featuring the tool.

Read More Here

Article Credit: Fortune

Go to Source

The post A.I. Songwriting Has Arrived. Don’t Panic appeared first on Statii News.



source http://news.statii.co.uk/a-i-songwriting-has-arrived-dont-panic/

AI Won’t Replace Us Until It Becomes Much More Like Us

Artificial Intelligence needs structural changes before it can really match up with humans.


The late Stephen Hawking worried that AI could end mankind. It seemed reasonable. Elon Musk warned that a machine that learned to operate without a human telling it what to do could “destroy humanity as a matter of course without even thinking about it” if it “[had] a goal and humanity just happens [to be] in the way.”

But reality has proven that while AI can beat humans at games, it still fails at common tasks an infant can do, such as holding an object. In fact, to solve this problem, researchers from OpenAI used 6144 CPUs and 8 GPUs to collect about one hundred years of experience and trained the AI for 50 hours. As a result, the robotic hand can handle unknown objects — as long as they are “within reason.”

The fundamental gap

As Antonio Bicchi, a professor of robotics at the Istituto Italiano di Tecnologia, said, the research had a number of limitations, such as the hand always facing up so that objects always fall into the palm.

We can’t tell for certain if another 100 years of training data would make the AI even better, or if it needs a new set of training data. What we can say is that humans are exceptionally good at incremental learning. Once a human learns to play with a ball, they can master any ball game quite easily. Or when we learn a foreign language, learning other languages becomes easier every time.

But an AI must learn everything from scratch. AI can’t use “other AIs.” It always starts from zero. AI cannot be “combined” with other AIs to do more complex tasks — at least not yet. So, while AI masters skills at a superhuman level, it only masters one task.

The missing piece

Recent developments in AI were boosted by the invention of deep learning algorithms and improved computing power, which seem to mimic how the brain operates by simulating networks of perceptrons. But the brain is much more than that.

We don’t know how the real brain works and, according to Sam Rodriques, we’ll never know until we drill holes in the human skull and plant probes to study how it really works “behind the scenes” (or bones). But what we do know is that the brain is much less rational than we used to think. Studies prove that we decide first, then we try to find reasons for why we decided the way we did.

In fact, people with brain damage who were incapable of developing emotions could perfectly describe what they should be doing in logical terms, yet they found it very difficult to make even simple decisions, such as what to eat. Our choices are arguably always based on emotion.

Yet, there is no AI system that works like this. AI can’t reason, which has led to hidden AI bias in many projects. Of course, there is work in progress to solve these cases, but AI was primarily designed as a black box since we did not know how to code it in the first place.

In addition, simply throwing more computing power at the problem and building bigger machines is not the answer if the machine takes the wrong path, and it can be expensive to find out where it is really going, since we can’t know what it has learned.

Read More Here

Article Credit: Entrepreneur

Go to Source

The post AI Won’t Replace Us Until It Becomes Much More Like Us appeared first on Statii News.



source http://news.statii.co.uk/ai-wont-replace-us-until-it-becomes-much-more-like-us/

AI RESEARCHERS FIGHT OVER FOUR LETTERS: NIPS


The future of humanity will be shaped by artificial intelligence. Now some of the best brains working on the technology are riven by a debate about a four-letter acronym that some say contributes to the field’s well-documented diversity problems.

NIPS is the name of AI’s most prominent conference, a venue for machine learning research formally known as the Annual Conference on Neural Information Processing Systems. Researchers at tech companies including Google and leading universities allege that the name contributes to an atmosphere unwelcoming to women. The acronym has long inspired anatomical jokes about nipples; others dislike the word’s racist connotations.

“It encourages juvenile behavior and it’s a distraction from the science we’re trying to do,” says Anima Anandkumar, a Caltech professor and director of research at chipmaker Nvidia. Late Wednesday she tweeted a link to an online petition asking the NIPS board to rename the event. There are now more than 800 signatories, including researchers at Amazon, Microsoft, and Google.

They include Jeff Dean, who as Google’s top AI boss leads one of the world’s largest and most influential AI research groups, dubbed Google Brain. Dean told WIRED Thursday that he plans to raise his concerns with members of the NIPS board when he gets the chance. “I think enough people are made to feel uncomfortable by the current name that the NIPS board should change the name,” he tweeted Wednesday.

University of Washington grad student Maarten Sap’s attempt to follow the brouhaha helped illustrate the complaints from Dean and others: When he plugged “nips” into Twitter’s search function, it led him to pornographic tweets.

AI is projected to reshape everything from health care to war, but the community of people working on the technology is markedly different from the society it is supposed to serve. WIRED and startup Element AI found that in recent years at NIPS and two other leading academic conferences, only about 12 percent of people presenting work were women. That suggests the field is even less diverse than the notoriously monocultural tech industry. Some researchers fear this raises the risk of incidents like those in which image recognition systems have been found to have skewed views of women or black people.

The name NIPS became a flashpoint in the debate over how to make AI more inclusive earlier this year after complaints about sexist behavior at the conference, which takes place each December. A blog post about an on-stage remark about sexual harassment at last year’s conference led to investigations into accusations of physical harassment by Google’s director of statistics research Steven Scott, and University of Minnesota professor Brad Carlin. Both subsequently left their jobs.

That incident helped inspire an open letter in March from 112 Johns Hopkins University faculty asking the NIPS board to seek a name less “vulnerable to sexual puns.” In April the board said it would consider alternatives, and in August it surveyed attendees from the past five years on crowdsourced options including SNIPS or ICOLS.

Read More Here

Article Credit: Wired

Go to Source

The post AI RESEARCHERS FIGHT OVER FOUR LETTERS: NIPS appeared first on Statii News.



source http://news.statii.co.uk/ai-researchers-fight-over-four-letters-nips/

Sunday 28 October 2018

Oracle CEO Mark Hurd Says His Customers Will Move to the Cloud. Eventually.

Oracle co-Chief Executive Mark Hurd views the move to the cloud as a marathon, one in which Oracle customers are slowly migrating. “This is not a switch flip,” Hurd told Barron’s in San Francisco on Wednesday. “This will take months, quarters, years, to ripple through” Oracle’s business.

Hurd thinks Oracle just gave customers a compelling reason to make the switch. This week at its annual developers conference, Oracle announced enhanced security to its cloud technology that includes “autonomous robots” that “find and kill” malicious attacks from bot-equipped hackers, he said.

He believes it’s a game-changer—less than 1% of all Oracle databases are encrypted—in a tech marketplace dominated by news of security snafus at Facebook (FB), Alphabet’s Google (GOOGL), and elsewhere.

“You can make the argument that this is the biggest technical release in our [41-year] history,” Hurd said. “There is no question about its impact on helping cloud sales.”

Still, despite a rally that briefly lifted shares this summer, Oracle stock has slumped amid questions about its cloud business. Oracle stock is down 9% since Oct. 1.

Oracle reported $6.61 billion in fiscal first-quarter revenue for cloud services and license support in September, which fell short of analysts’ estimates of $6.68 billion. Oracle also came up shy in overall revenue, at $9.19 billion, missing the $9.24 billion forecast by analysts polled by FactSet. The company has missed revenue estimates seven of the past 13 quarters.

Deepening concerns, the putative head of Oracle’s cloud product strategy, Thomas Kurian, said he isn’t coming back after the company initially said in early September that he was taking “extended time off” from the company. Kurian reportedly clashed with Oracle co-founder, Chairman and Chief Technology Officer Larry Ellison over cloud strategy. Oracle declined to comment on Kurian’s departure.

Adding to the tension, Ellison has boasted at previous OracleWorlds that Oracle was coming after the juggernaut Amazon Web Services, as Oracle shifts from a traditional model of licensing and maintenance to subscription-based cloud computing. In February, Oracle said it would quadruple its number of giant data-center complexes over the next two years.

But the case has been just the opposite the past few years, based on market share of the $38.8 billion worldwide market for DBMS, defined as business software for data management deployed onsite or in the computing cloud.

While Oracle remains the market leader, its market share eroded to 37% in 2017 from 43.6% in 2013, according to numbers compiled by market researcher Gartner. Amazon’s slice, meanwhile, has rocketed to 9.3% from 0.9% in the same time frame, and Microsoft’s has improved to 21.7% from 18%.

“Hurd talked about the slow rate of adoption in the cloud as if it were a feature and not a bug,” said Gartner analyst Merv Adrian, who is attending OracleWorld and had a briefing with Hurd. “He put a brave face on things. Hurd is saying some large customers are moving cautiously in shifting operations to the cloud.”

Read More Here

Article Credit: Barrons’s

Go to Source

The post Oracle CEO Mark Hurd Says His Customers Will Move to the Cloud. Eventually. appeared first on Statii News.



source http://news.statii.co.uk/oracle-ceo-mark-hurd-says-his-customers-will-move-to-the-cloud-eventually/

Microsoft Earnings: Mark Your Calendar

Can these key catalysts keep up their momentum?


Software giant Microsoft (NASDAQ:MSFT) has continued to impress investors in 2018 amid its ongoing progress becoming a more cloud-centric company. The company recently wrapped up its fiscal 2018 with a 14% year-over-year increase in revenue and an impressive 21% boost to operating income. Shares have surged 28% year to date.

In less than two weeks, investors will get to see whether the company has been able to keep up its strong momentum into fiscal 2019. Microsoft reports its first-quarter earnings for fiscal 2019 on October 24. Ahead of Microsoft’s earnings release, here’s an overview of some of the key areas investors will want to watch.

Commercial-cloud revenue

Core to Microsoft’s momentum recently is the company’s rapidly growing commercial-cloud revenue. As a revenue category that lumps together some of Microsoft’s biggest commercial-cloud products — including Office 365 commercial, Azure, and Dynamics 365 — Microsoft’s commercial-cloud revenue has been a major catalyst for the company.

In Microsoft’s most recent quarter, commercial-cloud revenue increased 53% year over year, to $6.9 billion, accounting for 23% of the quarter’s total revenue.

For Microsoft’s first quarter of fiscal 2019, investors should look for more year-over-year growth for the segment of around 50%. Microsoft CFO Amy Hood indicated in the company’s fiscal fourth-quarter earnings call that the segment continues to fire on all cylinders, saying: “Customer commitment to our cloud platform continues to increase. In FY18, we closed a record number of multi-million dollar commercial cloud agreements and more than doubled the number of $10 million-plus Azure agreements.”

Azure

Cloud-computing service Azure, which falls under Microsoft’s commercial-cloud revenue categorization, also will be worth taking a close look at. Not only is Azure Microsoft’s biggest contributor to the company’s commercial-cloud revenue, but it’s growing at a mind-boggling rate. In Microsoft’s fiscal fourth quarter, Azure revenue surged 89% year over year, or 85% year over year in constant currency.

Given that this was a slight deceleration compared to the 93% revenue growth (89% growth in constant currency) that Azure saw in the third quarter of fiscal 2018, investors shouldn’t be surprised to see some more modest deceleration in fiscal Q1. Of course, investors should look for Azure to grow at a high year-over-year growth rate of around 80%.

LinkedIn

Investors also should look for an update on LinkedIn, which Microsoft acquired in 2016. The social network for professionals has been an excellent performer for the software giant. Fiscal fourth-quarter revenue from LinkedIn was up 37% year over year. Even more impressive, this was the segment’s fifth quarter in a row of accelerating revenue growth.

Read More Here

Article Credit: The Motley Fool

Go to Source

The post Microsoft Earnings: Mark Your Calendar appeared first on Statii News.



source http://news.statii.co.uk/microsoft-earnings-mark-your-calendar/

Building An Analytics-Centric Organization


When I left the MIT Sloan School of Management in 1993, little did I know it at the time, but I was armed with the knowledge and mindset of a paradigm that would become known as Big Data. For the record, I owe tremendous credit to professors at MIT such as Steve Graves and Erik Brynjolfsson whose insights and perspectives left an indelible mark on my thinking. Although I could not fully articulate all of the details of what Big Data meant at the time, I did have an unwavering conviction that the lifeblood of a high-velocity organization would eventually be based on real-time analytics of and execution on mass quantities of data. And, to take it one step further, Big Data would revolutionize how companies operated and would drive the creation of whole new businesses (and perhaps even industries).

When I left MIT in 1993 for Intel, anyone could clearly see that computational power was advancing at a staggering rate and that the World Wide Web along with the broader infrastructure of the Internet were going to play a central role in this revolution. I knew that whatever “it” was – was going to be very big. But in all honesty, in my wildest dreams, I could never have predicted how big this digital transformation would become and how it would continue to evolve.

One afternoon in the mid-90’s, I wrote down a vision of an operating model for a corporation that would employ vast amounts of data and automated algorithms to enable the company to autonomously optimize and run its operations in real-time: integrating across operational domains that encompassed day-to-day tactical decision-making, operational planning and management and the long-term strategic decision-making. At the essence of this vision was an organization whose very existence and operational ethos was built on data and analytics. This is about as close as I ever got to a crystal ball. Imagine my delight and amazement when a few short years later, I had the opportunity to join a small but growing company (Amazon.com), led by a visionary leader (Jeff Bezos) that shared this vision almost to a tee! I had found myself in the business and career analog of the “kid in the candy store”! Candidly, I thought that life could never be better – until I found myself a decade later in an even bigger role at Google.

During my entire professional career, I have built analytics functions and groups that operated at the core of any operation that I’ve managed. Fortunately, along the way, I have had a disproportionate share of success, but have made some mistakes as well. I’d like to share some of those key learnings along the way. This article will be mostly retrospective in nature – I’ll save the predictions for a future article.

Read More Here

Article Credit: Forbes

Go to Source

The post Building An Analytics-Centric Organization appeared first on Statii News.



source http://news.statii.co.uk/building-an-analytics-centric-organization/

Saturday 27 October 2018

What happens to your bitcoin after you die?


No one knows how long cryptocurrencies will last, but it’s a decent bet they might outlast you. Passing your digital holdings on to loved ones after your death isn’t as simple as bequeathing cash or other property, though, particularly since wills aren’t designed for confidential information.

Because a private key is all that’s necessary to transfer funds from a wallet, including it in your will might be a terrible idea. “I would strongly advise against anyone putting any information they consider private into their will,” says estate planning attorney Gordon Fischer. “Wills, after your death, become court documents and are generally public documents, accessible by anyone.”

A private key is like an unchangeable password, which is generated when you create a new cryptocurrency wallet. It should always be kept as safe and secure as possible.
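
For readers unfamiliar with the mechanics, a private key is essentially just 256 bits of randomness; whoever knows it can move the funds, which is why writing it into a public court document is so dangerous. The sketch below shows only key generation and is not a real wallet (no address derivation or signing).

```python
# Minimal sketch of what a cryptocurrency private key is: ~256 bits of
# randomness. Anyone who learns this value can spend the associated funds,
# which is why printing it into a public document is so risky.
# This is not a real wallet implementation.

import secrets

private_key = secrets.token_bytes(32)          # 256 random bits
print("private key (hex):", private_key.hex()) # the secret to protect, never publish

# The public address is derived one-way from the key (elliptic-curve math,
# omitted here), so the key cannot be recovered from the address alone.
```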

Although a will might not enter public records immediately, it’s unwise to risk exposing the keys to your crypto wallets at all. Your family might not recognize the significance of a private key right away, and by the time they do, your digital wealth could be pilfered by crafty crooks or other unsavory characters. Fischer notes that trusts, however, are “generally private documents.”

With conventional assets, there’s an established procedure for claiming them through probate court, but with cryptocurrencies the process is less certain. Complicating matters is that many cryptocurrency exchanges don’t let their customers name beneficiaries. Coinbase, the largest trading platform, puts the burden on the heirs to claim any assets left by the deceased (so, hopefully your family knows which exchange you use). Another major bourse, Gemini, declined to comment for this story.

Of course, the problem isn’t confined to the cryptocurrency world: Popular stock trading app Robinhood, which recently began offering crypto trading, doesn’t offer basic beneficiary support. Sadly, as a result, proving that heirs are entitled to crypto inheritance could quickly become a protracted legal nightmare.

Read More Here

Article Credit: Quartz

Go to Source

The post What happens to your bitcoin after you die? appeared first on Statii News.



source http://news.statii.co.uk/what-happens-to-your-bitcoin-after-you-die/

Anybody Want Bitcoin Futures? Anybody?

The contracts that were seen as a step toward bringing crypto to Wall Street remain a tiny market.


On a Sunday evening last December, as the cryptocurrency craze consumed the world, traders waited eagerly at their computers to witness the debut of a flashy new financial product.

Bitcoin futures were set to start trading on Cboe Global Markets Inc.’s exchange at 5 p.m. Chicago time. Futures allow an investor to place bets on the price something will hit at a later date without having to buy the asset itself. Industry enthusiasts hoped the contracts would help bring Bitcoin trading into the financial mainstream, ushering in big investors with bulging pocketbooks.
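
To make the idea concrete, here is a toy profit-and-loss calculation for a cash-settled futures position; the contract size and prices are hypothetical assumptions, not CME or Cboe specifications.

```python
# Toy example of a cash-settled futures position: the trader never holds any
# bitcoin, only the difference between entry and settlement prices.
# Contract size and prices here are hypothetical, not exchange specifications.

contracts = 2
contract_size_btc = 1          # assumed size per contract
entry_price = 18_000           # price agreed when the position is opened
settlement_price = 16_500      # price at expiry

# A long position gains when the settlement price is above the entry price.
pnl = contracts * contract_size_btc * (settlement_price - entry_price)
print(f"Profit/loss on a long position: ${pnl:,}")   # negative = loss
```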

In the months leading up to the debut, Bitcoin’s price went ballistic. It surged around 600 percent between when Cboe revealed its plans in early August and when CME Group Inc. started trading its own version in mid-December, which coincided with Bitcoin’s record high of about $20,000. It has since lost more than half its value.

It’s not surprising that the cryptocurrency peaked within hours of the kickoff of CME’s contracts, according to Michael Unetich, vice president of cryptocurrencies at Chicago-based Trading Technologies International Inc. Some people were expecting tens of thousands of contracts per day to trade, he says, “and the market just wasn’t ready for that to happen.”

Ten months later, some of the hopes for Bitcoin futures look more like pipe dreams. Cboe and CME combined traded about 9,000 contracts a day in the third quarter. “It has not been what you would call a roaring success,” says Craig Pirrong, a finance professor at the University of Houston and an expert on futures trading. “Institutional players have stayed on the Bitcoin sidelines, and as long as they are, the futures contracts are likely not to generate substantial amounts of volume.”

The average of about 5,000 daily contracts at CME in the third quarter is up from around 3,500 in the prior quarter. Still, by comparison, CME traded more than 18 million contracts daily in the second quarter on products tied to everything from oil and gold to interest rates and the S&P 500. “We’re not seeing huge flows” for Bitcoin contracts, CME Chief Executive Officer Terry Duffy told Bloomberg Television in July.

Read More Here

Article Credit: Bloomberg

Go to Source

The post Anybody Want Bitcoin Futures? Anybody? appeared first on Statii News.



source http://news.statii.co.uk/anybody-want-bitcoin-futures-anybody/