Voice as a User Interface?
In the world of information management, most people are concerned with one important thing: how to store information so users can find it! Most companies are getting good at this (not great, but good) and they are getting better each year. Unfortunately, they are dealing with even more information than ever before.
Microsoft, Google, Amazon, and Apple have all demonstrated that people can interface with their services using only their voice. Of course, we humans are only beginning to adopt this as part of our daily lives, and that is because there are still some barriers. Eventually those barriers will be removed by some innovation, so why not plan for the future and start looking at a voice UI for employees to help them get the information they need to do their jobs better?
Creating a Voice UI for your Enterprise?
Through the use of Microsoft's Bot Framework and some innovation, we believe companies can train "bots" to help employees find information more efficiently, and on the go, than ever before! If you don't believe me, have a look at this case study by Microsoft, where they created a chatbot to help educators find training and teaching resources.
When you think of bots, you may be thinking of "chatbots"... you're right, those are the same bots that would power your voice UI. Bots are essentially software programs that automate tasks we would otherwise do ourselves. There have always been simple bots that provide a very specific service; however, more recently we are seeing the emergence of intelligent bots that can serve more complex use cases.
All that is needed is a voice layer for these bots and we're in business, right? Perhaps, but once you layer on voice, you need to be much more accurate in your responses for them to have utility... 99% accurate, according to Google. For that you need machine learning, combined with a strong foundation in information architecture and data science.
Machine Learning is a must...
You need machine learning because information may need to be summarized or analyzed before an answer is sent back to the person asking. Without it, you'll just get a list of search results displayed on the screen. This might be OK in some cases, but to be relevant, a number of factors from the user's profile must be taken into account before coming up with the recommended response, such as:
Did you catch that I said "Recommended Response"?
This is very important in the workplace because we need to ensure accountability and that people are making decisions... not machines. The machine should present a response as a recommendation in almost every case, unless it's a known fact (e.g., what time is it?).
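To make the distinction concrete, here is a minimal sketch (hypothetical code, not the Bot Framework API) of a bot that answers known facts directly but frames everything else as a recommendation, keeping the human accountable for the final decision:

```python
from datetime import datetime

# Questions with a single verifiable answer can be stated as fact.
KNOWN_FACTS = {
    "what time is it": lambda: datetime.now().strftime("%H:%M"),
}

def answer(question: str) -> str:
    q = question.lower().strip("?! .")
    if q in KNOWN_FACTS:
        return KNOWN_FACTS[q]()
    # Everything else is presented as a recommendation, not a decision.
    return f"Recommendation: based on your profile, try searching for '{question}'."

print(answer("What time is it?"))               # a fact: the current time
print(answer("Which training should I take?"))  # a recommendation
```

A real implementation would replace the keyword lookup with intent recognition from a language-understanding service, but the recommendation-versus-fact split stays the same.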
Types of Bots
Consumer-focused bots like Amazon Echo, Google Now, Cortana, and Siri are not really used that often in the real world. Don't get me wrong, they are great at what they do, but they have yet to make things much easier for people... mostly because, in our opinion, they are trying to do too much. That said, they are paving the way for the emergence of this type of service and for voice UI in the workplace.
Business-focused bots are more widely used and are everywhere. These bots are available through platforms and interfaces like Slack, Skype, Microsoft Teams, website chat windows, email assistants, and more. Today, these bots are focused on solving specific collaboration problems, replacing or augmenting email, acting as information assistants, providing support, and speeding up decision-making and communications.
Here are a few examples:
Let's face it, SSL encryption is important for any website, even if you're not handling credit card transactions. There are a few challenges with SSL, though: it can be expensive, renewals are a pain, and there is a lot of server configuration required to make everything work well.
When you're using Amazon Web Services, another challenge comes up which can be difficult to mitigate with traditional SSL certs: once you start autoscaling your infrastructure, you could quite easily end up with an AMI that has an expired cert on it. Fortunately, Amazon has thought of this and is here to help us out... welcome the Amazon Classic Load Balancer with an HTTPS Listener.
What is a Classic Load Balancer?
The load balancer serves as a single point of contact for clients. It provides high availability by distributing load across multiple servers, and gives you the opportunity to add and remove instances from your load balancer as your needs change, without disrupting the overall flow of requests to your application.
In addition, you can even get redundancy across multiple EC2 instances in multiple Availability Zones. This further increases the fault tolerance of your applications.
What is an HTTPS Listener?
The listener is the part of your load balancer that listens on the HTTP (80) and HTTPS (443) ports. What makes this interesting for us is that we can listen on the HTTPS (443) port and route traffic to any of our EC2 instances listening on HTTP (80). This means we do not need to manage SSL on any of our web servers, because all their traffic is routed through the listener.
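As a sketch of what that HTTPS-to-HTTP routing looks like in practice, here is the listener definition you would hand to the Classic ELB API (the load balancer name and certificate ARN are placeholders; the boto3 call is shown in a comment rather than run):

```python
# A Classic Load Balancer HTTPS listener: terminate SSL on port 443
# and forward plain HTTP to the instances on port 80.
listener = {
    "Protocol": "HTTPS",         # what clients connect with
    "LoadBalancerPort": 443,
    "InstanceProtocol": "HTTP",  # what the EC2 instances actually serve
    "InstancePort": 80,
    "SSLCertificateId": "arn:aws:acm:us-east-1:123456789012:certificate/example",
}

# With boto3 (not run here), this dict is what you would pass to the
# classic ELB client:
#   import boto3
#   elb = boto3.client("elb")
#   elb.create_load_balancer_listeners(
#       LoadBalancerName="my-load-balancer", Listeners=[listener])
print(listener["LoadBalancerPort"], "->", listener["InstancePort"])
```

The key design point is the protocol mismatch on purpose: HTTPS in, HTTP out, so the certificate lives only on the load balancer.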
Setting up the Load Balancer
Amazon has a great article on this, so I'm going to refer you to that to get things set up and start taking advantage of this great service.
Setting up a Certificate
So, here is the magic: you need to request a certificate for your load balancer using the AWS Certificate Manager (https://aws.amazon.com/certificate-manager/). This removes the time-consuming manual process of purchasing, uploading, and renewing SSL/TLS certificates. With AWS Certificate Manager, you can quickly request a certificate, deploy it on AWS resources such as Elastic Load Balancers or Amazon CloudFront distributions, and let AWS Certificate Manager handle certificate renewals. SSL/TLS certificates provisioned through AWS Certificate Manager are free; you pay only for the AWS resources you create to run your application.
Once you set up a certificate, you can associate it with your HTTPS Listener and you're in business... no more third-party certificates needed, all basically for free. Here is some direction to get you started: http://docs.aws.amazon.com/acm/latest/userguide/gs-elb.html
Tip: Updating your DNS
Because the set of IP addresses associated with a load balancer can change over time, you should never create an "A" record with any specific IP address. If you want to use a friendly DNS name for your load balancer instead of the name generated by the Elastic Load Balancing service, you should create a CNAME record for the load balancer's DNS name, or use Amazon Route 53 to create a hosted zone.
What is LTI?
Learning Tools Interoperability® (LTI®) is a specification developed by IMS Global Learning Consortium to establish a standard way of integrating learning tools with learning systems. It's widely adopted by almost all learning management systems!
Developers have adopted this standard because it allows for fairly seamless integration with third-party content and tools... and we all know how much of that there is on the internet!
Another interesting aspect of LTI is that the tool provider can send outcome data back to the consumer, such as results from an assessment or activity. This type of data exchange can be invaluable to an LMS because not only can third-party content be embedded in a course, so can assessments.
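To give a feel for how small the core of the standard is, here is a sketch of the main POST parameters in an LTI 1.1 basic launch. The first three fields are the ones the spec requires; a real launch also carries OAuth 1.0 signature fields and user/course context, and the URLs and IDs below are placeholders:

```python
launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course-101-activity-7",  # unique per placement
    # Optional fields that enable the outcomes (grade pass-back) flow
    # described above: where to POST results, and an opaque grade handle.
    "lis_outcome_service_url": "https://lms.example.com/outcomes",
    "lis_result_sourcedid": "opaque-grade-handle",
}

required = {"lti_message_type", "lti_version", "resource_link_id"}
print(required.issubset(launch_params))
```

When the tool finishes an assessment, it uses `lis_outcome_service_url` and `lis_result_sourcedid` to send the score back to the LMS, which is exactly the data exchange that makes third-party assessments viable.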
How TEAMS uses LTI for integration
Recently our team wrapped up some enhancements to TEAMS (www.teams-hub.com) to incorporate support for LTI. Using TEAMS with your LMS now makes it easy to share content, assign resources to students, and track grades and progress.
The LTI specification is available on the IMS website at: http://www.imsglobal.org/lti/.
Today the U.S. Department of Education released detailed data for college debt, graduation rates, test scores, and more. There is a great website to explore the data and even a front-facing College Scorecard that lets you look up information for your university.
You can download the data as a single ZIP file, access it via the data.gov API, and most importantly, there's documentation.
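For anyone who wants to try the API route, here is a sketch of building a College Scorecard query URL (the api.data.gov endpoint and field names are from the public docs as we understand them; "DEMO_KEY" is a placeholder for a real api.data.gov key, and no request is actually made here):

```python
from urllib.parse import urlencode

BASE = "https://api.data.gov/ed/collegescorecard/v1/schools"

def scorecard_url(school_name: str, api_key: str = "DEMO_KEY") -> str:
    """Build a query URL filtering by school name and selecting two fields."""
    params = {
        "school.name": school_name,
        "fields": "school.name,latest.admissions.admission_rate.overall",
        "api_key": api_key,
    }
    return f"{BASE}?{urlencode(params)}"

print(scorecard_url("Harvard"))
```

From there, any HTTP client can fetch the URL and parse the JSON response.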
It's great news to see this data made available to the public!
QC Technology Decisions Inc. is pleased to offer a new service that addresses one of the most common issues faced by companies wanting business intelligence. For as little as $225/month, customers will have a dedicated resource to build and maintain their PowerBI or Cognos Insight dashboards.
While Software as a Service (SaaS) solutions for business intelligence frameworks are providing new and exciting options to adopt new technologies faster, resource constraints make it very difficult to keep pace with changes in these technologies.
Development as a Service (DevaaS) options from QC give customers flexibility and reduce risk when choosing the direction they wish to take regarding business intelligence. In addition to having very low up-front costs, our services have the following benefits:
QC combines technology and labor in efficient, effective ways so we can spring into action when needed. You simply pay a flat monthly fee for each product we develop; this covers the development and ongoing support costs.
The monthly fee assumes that you already have licensing for the technology and will provide access to our consultants so that we may develop and support the products within your development, test and/or production environments. All work will be completed by a senior consultant remotely and confidentially.
We guarantee our services: if you don't like the service for any reason, you simply cancel your subscription.
For any dashboard to be useful, the tool needs the ability to connect to real data. IBM and Microsoft technologies can connect to a variety of systems to pull data into their business intelligence tools. If you have not set up any framework for business intelligence yet, QC can get you started for as little as $4,500.
We recently read a post by Peter Ku in June titled "Are you ready for the 'Data Tsunami' from the Affordable Health Care Ruling" and found it extremely interesting, although our impression is that this is going to be more of an earthquake than a tsunami.
Our team members have worked in health care for a number of years, and they can all tell you that there is definitely no shortage of data. This data is building up under our feet every day in hospitals, clinics, and health care offices around the globe... in essence, the data is already out there, hidden, waiting to crack the surface!
Peter’s absolutely right that there will soon be a huge inflow of new enrollments in the US related to the latest affordable health care ruling. This is going to rock and test the foundation each state has laid in connecting health information, and put to the test both the technology and the services that have been developed. One thing is certain: if states have not moved past governance decisions at this point and established standard processes and procedures for managing the information, then we are sure to see more news reports soon of failed health exchanges.
Kudos to Peter for putting together such a great article that highlights the importance of data integration, security, and data quality! We have been working with Informatica tools for many years and they are, without a doubt, leaders in their field.
Our belief is that data is complex only because organizations lack the tools to manage it properly. When it is managed properly, understanding what to do with it becomes much easier, and it essentially starts to be managed like any other commodity or resource. Some pioneers such as Facebook and Google found the formula to monetize it early, but the rest of the world is catching up!
Tsunami or earthquake, QC Technology Decisions Inc. can help prepare your business with a solid foundation and understanding so you can remain competitive. If you are interested in learning more, please contact us today!
In recent years there has been a continued focus on providing patients with intelligent, connected devices, with the promise of improved health and reduced costs. With indicators such as the strong growth of the telemonitoring industry ($3 billion[i]), the growth of personal health devices, and the emphasis on IoT (Internet of Things) projects, it’s safe to say that there are tremendous opportunities in this market space.
What is driving this industry now?
Further driving innovation in this area, particularly in the US, are government programs that penalize health care providers for readmissions. We believe there may be opportunities for a wide range of telemonitoring services: closely monitoring patients who have recently had cardiac care, as well as those with diabetes and/or hypertension, has shown some promise in reducing readmission rates. The potential for solutions that provide cost avoidance is very real and happening now.
While the long-term hope is that telemonitoring provides a solution to readmission issues in the US, monitoring patients from home (or an extended care facility) is, much like monitoring patients in a hospital, still very workflow intensive. At home, there is a strong dependency on patients to report their vitals on a regular basis, and only systems that include the “human element” seem to have shown success. For this reason, telemonitoring programs are expected to struggle somewhat in the next few years, until device usability improves and the costs of technology and workflow drop to a point where it is cheaper for health providers to outsource these services. That said, patients who are able and have the discipline to embrace telemonitoring programs do see improvements and do have an overall feeling of control and empowerment over their health.
What are the current trends?
Telehealth has been a part of modern health care systems for many years, and has traditionally relied on expensive, specialized equipment to connect providers and patients. Although telemonitoring is a somewhat new concept and the market is still emerging, we expect that telemonitoring will drive down the cost of telehealth and open new opportunities for connected health device manufacturers. Still, the industry is dominated by large players such as Honeywell, GE, and Philips, which will most likely retain their grip on major health providers such as hospitals and government agencies. We expect that within 2-3 years most (if not all) major medical centers in the US will offer telemonitoring services directly to their patients.
One growing trend (and one that would support a wider range of service providers) is best represented by a company called Nurse Next Door, which is using telemonitoring technologies both to gain a competitive advantage and to improve home care services for its patients. Companies such as AMC Health and CareCentrix support these initiatives by providing telemonitoring as a service to home care agencies, clinical trials, accountable care organizations, and integrated health systems. Companies such as STI specialize in telemonitoring, providing no-hassle rental programs, central-station medical alert monitoring, and back-end services for cleaning, repair, calibration, and maintenance of equipment.
It’s important not to exclude the rise of personal health devices. These consumer devices are all the rage these days, and although they are mostly ineffective and inaccurate when it comes to vitals monitoring, their value appears to be in self-awareness and connectedness. This self-awareness in personal health monitoring is beginning to create a culture of future patients (aka “the connected patient”) who will most likely demand telemonitoring so they can continue to feel empowered. It’s expected that the trend of personal health devices will continue, and that telemonitoring device hubs will support connecting some of these devices in addition to other clinical monitoring devices the patient may need in their home.
In the US, there are almost 30,000 home health care service establishments; California, Texas, and Florida have the most. In Canada, there are comparatively fewer service providers, with only 1,285 establishments.
In Canada, there are approximately 60 companies registered as ICT telemonitoring companies, indicating that this is still a relatively small market. The closest segment to this in the US industry data is the “All other miscellaneous ambulatory health care services” category, of which there are 4,000 companies.
The market leader in home monitoring, including software for their devices, is Honeywell with their Lifestream product line. However, we came across only a few companies in our research that provide off-the-shelf products for telemonitoring (such as Cardiocom, Home Health Hub, and Authentidate), which indicates that the industry still requires significant ICT support to bridge the gap by developing custom solutions for customers.
We are always on the lookout for emerging trends in leveraging the vast amounts of data that systems are collecting around the world. If you are interested in learning more about where this industry is headed, or would like assistance with your next real-time data integration project, please contact us anytime.
Recently we had a great opportunity to work with one of our clients who was implementing a new student information system (Infinite Campus, if you’re wondering which one). If you’ve done this before, congratulations, you know what an undertaking it can be!
Our role was to develop an integration architecture that would meet their immediate need of replacing about 20 integration points between their other systems, while setting the foundation to meet their long-term needs for data warehousing, point-to-point systems integration, and data import/export. Oh, I forgot to mention: our client was a school district on a tight budget!
We’re familiar with quite a few integration platforms such as Informatica, Oracle, and IBM... each having great integration functionality, but with a hefty price tag. Microsoft also offers an enterprise-class integration platform, included with SQL Server at less than $2,000 per core! SQL Server 2012 Integration Services was our choice for this client.
At first, I think our client was a bit reluctant about the architecture because it was out of their comfort zone. Most school districts simply script their integration points using what’s available to them (Perl, SQL, VBScript, Excel, Access, etc.) and don’t invest in good integration software. Moving away from the “tried and true” is always a bit difficult, but software designed for integration will always beat out any home-grown solution. In the end, with some guidance and good examples, they found their feet and are now starting to see that this type of architecture will pay off for them in the long run.
The main reason for our success in gaining our client’s trust (and ultimately adoption of the system) was that we were able to line up all the components and fit them into an easy-to-understand process. Here is what we defined for them for developing exports (data extracts) using SQL Server Integration Services.
Step 1 – Determine the Requirements
It goes without saying: don’t start a project unless you know what you need to do. The complexity of an export package will depend on how complex your transformation requirements are. Some packages can take minutes to develop, whereas others could take days. The requirements should define both the “sources” and “targets” (destinations) for your export, and if you need to filter or convert data, they must define the logic (business rules) required to transform the data into the correct target format.
Step 2 – Create/Open a SSIS Project
Developing an export requires that you first install SQL Server Data Tools (from the SQL Server installation disk). Once installed, you can create your project and one or more packages. We recommend that you create a separate project for each “target” system.
Step 3 – Create a new Package and Develop the Export
The outcome of the development is going to be one or more export packages. We ended up creating a few templates for our client to get them started so they could see how to build exports in a variety of formats and destinations.
Step 4 – Test your Export
SQL Server Data Tools includes a debugger that lets you run and debug your packages locally.
Step 5 – Deploy your SSIS Project
Once you have a working package, you can use the “Deploy” option in SQL Server Data Tools to deploy your package to your Integration Services catalog. If you’re not using the Integration Services catalog, you’re probably going to run into configuration issues down the road... so it’s highly recommended. With it, you will be able to audit all your package executions, automatically version your projects, and schedule them with confidence that they will run just like you tested.
Step 6 – Schedule your Package to Run Automatically
Finally, we want to automate our exports. This is done by using SQL Server Agent to run your SSIS packages on a schedule you set. When you schedule these, you can also have SQL Server notify you if something goes wrong!
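Under the hood, a SQL Server Agent job step for a catalog-deployed package boils down to a dtexec invocation against the SSISDB path. Here is a small sketch that builds that command line; the folder, project, and package names are hypothetical placeholders:

```python
def dtexec_command(folder: str, project: str, package: str,
                   server: str = "localhost") -> str:
    # Catalog packages are addressed by their \SSISDB\<folder>\<project>\<pkg>
    # path; /ISServer tells dtexec to run from the Integration Services catalog.
    catalog_path = f"\\SSISDB\\{folder}\\{project}\\{package}"
    return f'DTExec /ISServer "{catalog_path}" /Server "{server}"'

cmd = dtexec_command("Exports", "StudentSystem", "DailyExport.dtsx")
print(cmd)
```

In practice you would let SQL Server Agent’s “SQL Server Integration Services Package” step type generate this for you, but seeing the command makes it clear what the schedule is actually executing.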
Our vision is to be a leader in best-practice business intelligence and information management solutions that improve performance and communication between decision makers, administrators, employees, and the public. We deliver sustainable solutions for our clients by creating scalable designs that meet both short-term (immediate) and long-term requirements.
Principal consultants at QC are information experts specializing in business intelligence and enterprise information management. We are committed to providing the best service possible for our clients by offering only qualified and experienced consultants that meet their needs.
If you are interested in learning more about SQL Server Integration Services or any of our other services please contact us today!
Shane Quigley is an expert in data warehousing, business intelligence, systems analysis, and solution architecture.