QC Technology Decisions Inc.

6 Business Intelligence Trends to Watch For in 2019

11/18/2018


 
"The goal of business intelligence (BI) is to thoughtfully and purposefully collect and analyze past information to support an organization and make better decisions about it. As the new year approaches, 2019 business intelligence trends are creating a buzz. Even though the reasonwhy companies engage in business intelligence remains relatively consistent from year to year, the ways those establishments go about it differ over time. Here are six business intelligence trends likely to play out in 2019." ( from the Smart data collective)

6 Business Intelligence Trends to Watch For in 2019 https://www.smartdatacollective.com/business-intelligence-trends-to-watch-for-2019/

Voice is the next UI

4/24/2017


 

Voice as a User Interface?

In the world of information management, most organizations are concerned with one important thing: how to store information so users can find it. Most companies are getting good at this (not great, but good), and they are getting better each year. Unfortunately, they are also dealing with more information than ever before:
  • Decentralized news and documents are being aggregated through enterprise search portals every minute
  • Social information is being generated every second on enterprise social networks (ESNs) and microblogs
  • Real time IoT information is being transmitted across networks
In the world of big data, we call this "velocity", and it's speeding up, which means information is going to become harder and harder to find unless we change how we FIND it. It's said that we speak on average 150 words per minute and type 40; you do the math, it's faster to talk than to type!

Microsoft, Google, Amazon and Apple have all demonstrated that people can interface with their services using only their voice. Of course, we humans are only now coming around to fully adopting this as part of our daily lives, because there are still some barriers. Eventually, those barriers will be removed by some innovation, so why not plan for the future and start looking at a voice UI to help employees get the information they need to do their jobs better?

Creating a Voice UI for your Enterprise?

Using Microsoft's Bot Framework and some innovation, we believe companies can train bots to help employees find information more efficiently, and from more places, than ever before. If you don't believe me, have a look at this case study by Microsoft, in which they created a chatbot to help educators find training and teaching resources.

When you think of bots, you may be thinking of "chatbots". You're right: those are the same bots that would power your voice UI. Bots are essentially software programs that automate tasks we would otherwise do ourselves. There have always been simple bots that provide a very specific service; more recently, however, we are seeing the emergence of intelligent bots that can serve more complex use cases.
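To make this concrete, here is a minimal, hedged sketch of an echo-style bot using Microsoft's Bot Framework SDK for Python (the botbuilder packages). The handler class and methods are real SDK types, but the bot itself, its name, and the hosting setup (adapter, web app) are illustrative assumptions only:

```python
from botbuilder.core import ActivityHandler, MessageFactory, TurnContext


class InfoFinderBot(ActivityHandler):
    """Illustrative bot: acknowledges the user's question.
    A real deployment would call your enterprise search or ML layer here."""

    async def on_message_activity(self, turn_context: TurnContext):
        question = turn_context.activity.text
        # Placeholder: swap in a call to your search/recommendation service.
        await turn_context.send_activity(
            MessageFactory.text(f"Let me look that up for you: {question}")
        )

    async def on_members_added_activity(self, members_added, turn_context: TurnContext):
        for member in members_added:
            if member.id != turn_context.activity.recipient.id:
                await turn_context.send_activity(
                    "Hi! Ask me where to find a document, policy, or report."
                )
```

The same handler can be surfaced through text channels today and fronted with a speech layer later, which is exactly the migration path suggested here.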

All that is needed is a voice layer for these bots and we're in business, right?  Perhaps, but once you layer on voice, your responses need to be much more accurate for the service to have utility: 99% accurate, according to Google. For that you need machine learning, combined with a strong foundation in information architecture and data science.
CONTACT US TO DESIGN YOUR VOICE UI

Machine Learning is a must...

You need machine learning because information may need to be summarized or analyzed before an answer is sent back to the person asking for it.  Without it, you'll just get a list of search results displayed on the screen. That might be OK in some cases, but to be relevant, a number of factors from the user's profile must be taken into account before coming up with the recommended response (a simple sketch of this kind of re-ranking follows the list), such as:
  • Location
  • Language
  • Job title
  • And others...
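Here is a deliberately simple, hedged illustration of profile-aware re-ranking in Python. The field names and hand-tuned boosts are hypothetical; a production system would learn these weights from data:

```python
from dataclasses import dataclass


@dataclass
class UserProfile:
    location: str
    language: str
    job_title: str


def rank_results(results: list[dict], profile: UserProfile, top_n: int = 3) -> list[dict]:
    """Re-rank search results by boosting items that match the user's profile."""

    def score(result: dict) -> float:
        s = result.get("base_relevance", 0.0)
        if profile.location in result.get("regions", []):
            s += 0.3   # local content first
        if profile.language == result.get("language"):
            s += 0.2   # prefer the user's language
        if profile.job_title in result.get("audiences", []):
            s += 0.5   # content aimed at this role
        return s

    return sorted(results, key=score, reverse=True)[:top_n]
```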

Did you catch that I said "recommended response"?

This is very important in the workplace because we need to ensure accountability, and that people are making decisions, not machines. The machine should present a response as a recommendation in almost every case, unless it's a known fact (e.g. "What time is it?").

Types of Bots

Consumer-focused bots like Amazon Echo, Google Now, Cortana and Siri are not really used that often in the real world. Don't get me wrong, they are great at what they do, but they have yet to make things much easier for people, mostly because, in our opinion, they are trying to do too much. That said, they are paving the way for the emergence of this type of service and for voice UIs in the workplace.

Business-focused bots are more widely used and are everywhere. These bots are available through platforms and interfaces like Slack, Skype, Microsoft Teams, website chat windows, email assistants, and so on. Today, these bots focus on solving specific collaboration problems, replacing or augmenting email, acting as information assistants, providing support, and speeding up decision-making and communications.

Here are a few examples:
  • x.ai and Clara Labs provide a virtual assistant to help you set up and manage your meetings.
  • Gong.io and Chorus provide a bot that listens in on sales calls and uses voice-to-text and other Machine Learning algorithms to help your sales teams get better and close more deals.
  • Astro is building an AI-assisted email app that will have multiple interfaces including voice (Echo).
  • Twyla is helping make chatbots on websites more intelligent using ML. It integrates with your existing ZenDesk, LivePerson, or Salesforce support.
  • Clarke.ai uses AI to take notes for your meeting so you can focus better.
  • Smacc provides AI-assisted, automated bookkeeping for SMBs.
  • Slack is one of the fastest-growing SaaS companies and has the most popular bot store.
  • Amazon Lex is a service for building conversational interfaces into any application using voice and text. (Hackathon: AWS Chatbot Challenge, April 19 - July 18, 2017)
  • Microsoft Bot Framework lets developers build and connect intelligent bots that interact with users naturally wherever they are, from your website or app to text/SMS, Skype, Slack, Facebook Messenger, Office 365 mail, Teams and other popular services.
  • Telegram Bot API is an HTTP-based interface for building bots on Telegram. ($1M in bot prizes through 2017)
  • Chatbots.io lets you build conversational AI for applications.
  • Wire offers an end-to-end encrypted bot API.
CONTACT US TO DESIGN YOUR VOICE UI

How to get SSL for Free with AWS

4/17/2017


 
Let's face it, SSL encryption is important for any website, even if you're not handling credit card transactions. There are a few challenges with SSL, though: it can be expensive, renewals are a pain, and there is a lot of server configuration required to make everything work well.

When you're using Amazon Web Services, another challenge comes up that can be difficult to mitigate with traditional SSL certificates: once you start autoscaling your infrastructure, you can easily end up with an AMI that has an expired certificate baked into it. Fortunately, Amazon has thought of this and is here to help us out. Welcome the Amazon Classic Load Balancer with an HTTPS listener.

What is a Classic Load Balancer?

The load balancer serves as a single point of contact for clients. It distributes load across multiple servers for high availability and gives you the opportunity to add and remove instances as your needs change, without disrupting the overall flow of requests to your application.

In addition, you can even get redundancy across multiple EC2 instances in multiple Availability Zones. This further increases the fault tolerance of your applications.
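If you like to script your infrastructure, here is a hedged boto3 sketch of creating a classic load balancer that spans two Availability Zones and registering back-end instances with it. The load balancer name, zones and instance IDs are placeholders:

```python
import boto3

elb = boto3.client("elb", region_name="us-east-1")

# Create a classic load balancer spanning two Availability Zones.
# It starts with a plain HTTP listener; the HTTPS listener is added later.
elb.create_load_balancer(
    LoadBalancerName="my-classic-elb",
    Listeners=[{
        "Protocol": "HTTP",
        "LoadBalancerPort": 80,
        "InstanceProtocol": "HTTP",
        "InstancePort": 80,
    }],
    AvailabilityZones=["us-east-1a", "us-east-1b"],
)

# Register the back-end EC2 instances that will receive the traffic.
elb.register_instances_with_load_balancer(
    LoadBalancerName="my-classic-elb",
    Instances=[
        {"InstanceId": "i-0123456789abcdef0"},
        {"InstanceId": "i-0fedcba9876543210"},
    ],
)
```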

What is an HTTPS Listener?

A listener is the part of your load balancer that accepts connections on a given port, such as HTTP (80) or HTTPS (443). What makes this interesting for us is that we can listen on HTTPS (443) and route traffic to EC2 instances that are listening on HTTP (80). This means we do not need to manage SSL on any of our web servers, because all incoming traffic terminates at the listener.

Setting up the Load Balancer

Amazon has a great article on this, so I'm going to refer you to it to get things set up and start taking advantage of this great service.

http://docs.aws.amazon.com/elasticloadbalancing/latest/classic/elb-create-https-ssl-load-balancer.html

Setting up a Certificate

So, here is the magic: you need to request a certificate for your load balancer using AWS Certificate Manager (https://aws.amazon.com/certificate-manager/). This removes the time-consuming manual process of purchasing, uploading, and renewing SSL/TLS certificates. With AWS Certificate Manager, you can quickly request a certificate, deploy it on AWS resources such as Elastic Load Balancers or Amazon CloudFront distributions, and let AWS Certificate Manager handle certificate renewals. SSL/TLS certificates provisioned through AWS Certificate Manager are free; you pay only for the AWS resources you create to run your application.

Once you set up a certificate, you can associate it with your HTTPS listener and you're in business: no more third-party certificates needed, all basically for free. Here is some direction to get you started: http://docs.aws.amazon.com/acm/latest/userguide/gs-elb.html
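Scripted with boto3, the two steps look roughly like the sketch below. The domain and load balancer name are placeholders, and the certificate must finish validation and be issued before the listener call will succeed:

```python
import boto3

acm = boto3.client("acm", region_name="us-east-1")
elb = boto3.client("elb", region_name="us-east-1")

# 1. Request a free ACM certificate for the site's domain.
cert = acm.request_certificate(
    DomainName="www.example.com",
    ValidationMethod="DNS",   # email validation is also available
)
cert_arn = cert["CertificateArn"]

# 2. Once the certificate is issued, attach it to an HTTPS (443) listener
#    that forwards to HTTP (80) on the back-end instances.
elb.create_load_balancer_listeners(
    LoadBalancerName="my-classic-elb",
    Listeners=[{
        "Protocol": "HTTPS",
        "LoadBalancerPort": 443,
        "InstanceProtocol": "HTTP",
        "InstancePort": 80,
        "SSLCertificateId": cert_arn,
    }],
)
```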


Tip: Updating your DNS

Because the set of IP addresses associated with a load balancer can change over time, you should never create an "A" record pointing at any specific IP address. If you want to use a friendly DNS name for your load balancer instead of the name generated by the Elastic Load Balancing service, you should create a CNAME record for the load balancer's DNS name, or use Amazon Route 53 to create a hosted zone for your domain.
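As a hedged illustration, here is how that CNAME could be created with boto3 and Route 53. The hosted zone ID, record name and load balancer DNS name are placeholders; note that a CNAME cannot be used at the zone apex, where a Route 53 alias record is the usual answer:

```python
import boto3

route53 = boto3.client("route53")

# Point a friendly name at the load balancer's generated DNS name.
route53.change_resource_record_sets(
    HostedZoneId="Z1234567890ABC",
    ChangeBatch={
        "Comment": "Friendly name for the classic load balancer",
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "www.example.com",
                "Type": "CNAME",
                "TTL": 300,
                "ResourceRecords": [
                    {"Value": "my-classic-elb-1234567890.us-east-1.elb.amazonaws.com"}
                ],
            },
        }],
    },
)
```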


Learning Tools Interoperability (LTI)

7/7/2016


 

What is LTI?

Learning Tools Interoperability® (LTI®) is a specification developed by IMS Global Learning Consortium to establish a standard way of integrating learning tools with learning systems.   It's widely adopted by almost all learning management systems!
Developers have adopted this standard because it allows for nearly seamless integration with third-party content and tools, and we all know how much of that there is on the internet!
IMS Global describes this interaction nicely: "if you have an interactive assessment application or virtual chemistry lab, it can be securely connected to an educational platform in a standard way without having to develop and maintain custom integrations for each platform."

The Tool Consumer (e.g. the LMS) consumes the learning tool, and the Tool Provider (e.g. a premium content service) provides it. The nature of the relationship is that the provider delegates responsibility for authentication and authorization to the consumer, while the consumer provides data about the user and the context from which they launched the tool.
Another interesting aspect of LTI is that the tool provider can send outcome data, such as results from an assessment or activity, back to the consumer. This type of data exchange can be invaluable to an LMS because not only can third-party content be embedded in a course, so can assessments.
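In LTI 1.x, the consumer expresses all of this as a signed form POST to the tool's launch URL, using OAuth 1.0 body signing with a key and secret shared between the two systems. Here is a hedged sketch in Python using oauthlib; the parameter values, key and secret are hypothetical, and a real launch includes more fields:

```python
from urllib.parse import urlencode

from oauthlib.oauth1 import Client, SIGNATURE_TYPE_BODY

launch_url = "https://tool.example.com/lti/launch"
params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course-42-activity-7",
    "user_id": "student-123",
    "roles": "Learner",
    "context_id": "course-42",
    # Where the provider may POST outcome (grade) data back to the consumer.
    "lis_outcome_service_url": "https://lms.example.com/outcomes",
    "lis_result_sourcedid": "course-42:student-123:activity-7",
}

# Sign the parameters with the shared consumer key/secret (OAuth 1.0, HMAC-SHA1).
client = Client("my-consumer-key", client_secret="my-shared-secret",
                signature_type=SIGNATURE_TYPE_BODY)
uri, headers, body = client.sign(
    launch_url,
    http_method="POST",
    body=urlencode(params),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)

# `body` now carries the original fields plus the oauth_* signature fields;
# the consumer renders them as hidden form inputs and auto-submits to the tool.
print(body)
```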

How TEAMS uses LTI for integration

Recently our team wrapped up some enhancements to TEAMS (www.teams-hub.com) to incorporate support for LTI. Using TEAMS with your LMS now makes it easy to share content, assign resources to students, and track grades and progress.  

More Information

The LTI specification is available on the IMS website at: http://www.imsglobal.org/lti/.


Detailed data release for U.S. college debt, graduation rate, and test scores  

9/14/2015


 
Today the U.S. Department of Education released detailed data for college debt, graduation rates, test scores, and more.  There is a great website to explore the data and even  a front-facing College Scorecard that lets you look up information for your university.

You can download the data as a single ZIP file, access it via the data.gov API, and most importantly, there's documentation.
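If you want to pull the data programmatically rather than download the ZIP file, the College Scorecard API on api.data.gov is the place to start. A hedged sketch (the field names are examples; check the API documentation for the full list, and you will need a free api.data.gov key):

```python
import requests

resp = requests.get(
    "https://api.data.gov/ed/collegescorecard/v1/schools",
    params={
        "api_key": "YOUR_API_KEY",          # free key from api.data.gov
        "school.name": "University of Texas",
        "fields": "school.name,school.city,school.state",
        "per_page": 5,
    },
    timeout=30,
)
resp.raise_for_status()
for school in resp.json()["results"]:
    print(school)
```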

Great news to see this data made available to the public!!

New Development as a Service (DevaaS) Offering

9/2/2015


 
QC Technology Decisions Inc. is pleased to offer a new service that addresses one of the most common issues faced by companies wanting business intelligence. For as little as $225/month, customers get a dedicated resource to build and maintain their Power BI or Cognos Insight dashboards.

Subscription Benefits

While Software as a Service (SaaS) solutions for business intelligence are providing new and exciting options for adopting new technologies faster, resource constraints make it very difficult to keep pace with changes in these technologies.

Development as a Service (DevaaS) options from QC give customers flexibility and reduce risk when choosing the direction they wish to take with business intelligence. In addition to having very little up-front cost, our services offer the following benefits:

  • A fixed monthly cost per dashboard is easy to budget and communicate to stakeholders
  • Provides support for self-serve business intelligence
  • Scale solutions without impacting your existing resources
  • Allows existing resources to work on new projects
  • Ensures dashboards can evolve with the business needs

QC combines technology and labor in efficient, effective ways so we can spring into action when needed. You simply pay a flat monthly fee for each product we develop; this covers development and ongoing support costs.

Getting Started

The monthly fee assumes that you already have licensing for the technology and will provide access to our consultants so that we may develop and support the products within your development, test and/or production environments. All work will be completed by a senior consultant remotely and confidentially.
Subscribe now
We guarantee our services: if you don’t like the service for any reason, you simply cancel your subscription.

For any dashboard to be useful, the tool needs the ability to connect to real data. IBM and Microsoft technologies can connect to a variety of systems to pull data into their business intelligence tools. If you have not set up a business intelligence framework yet, QC can get you started for as little as $4,500.
request an estimate

Managing Quality in an Education Data Warehouse

8/14/2015


 
Modern Education Data Warehouses require that data be conformed across multiple sources and in some cases even across multiple state agencies such as Early Childhood Development, Public Education (K12) Agencies, Private Education Agencies, Higher Education Agencies and Workforce Development.   Emerging trends in “Big Data” indicate that agencies will want to incorporate other data such as public data sets to perform statistical analysis to improve the Educational System.

Education Data Warehouses need to maintain accurate sets of data and cohorts over time, sourced from a variety of systems. This poses a unique challenge for data integration and quality control: creating a sustainable and efficient process capable of moving data faster into the hands of the decision makers who need it.

Enterprise data integration capabilities are the foundation for any data warehousing solution. Using reliable, enterprise-class technology from Informatica, the solution accesses, integrates, and delivers data of any volume, for any application, from virtually any source, in any format, at any latency. Armed with these capabilities, IT organizations can break down silos wherever data is held, enabling seamless data sharing across multiple state agencies and commissions.

Although education agencies (both local and at the state level) have developed processes to match student records, generate IDs, and conform data to meet their unique data warehousing requirements, they continue to struggle with data collection processes. This bottleneck is a barrier to “Velocity”; removing it would, by the definition of the three V’s (Volume, Velocity, and Variety), bring Education Data Warehousing into the “Big Data” arena.

Addressing Data Quality Challenges

Data quality is not a one-time effort; in traditional data warehousing projects, a significant portion of the time (if not most of it) is spent cleansing and organizing the data in a way that makes sense to business users. The events and changes that allow data anomalies to be introduced into an environment are not unique; however, addressing anomalies becomes critically important when users are relying on accurate student records and demographics to make decisions. Data management teams must not only address acute data failures, but also baseline the current state of data quality so they can identify the critical failure points and determine improvement targets.

The ability to monitor data quality and react quickly to changes demonstrates a level of organizational maturity that views information as an asset and rewards proactive involvement by delivering on the promises of business intelligence: data trust, business value, and process alignment.

Address data quality challenges through:
  • Assessment
  • Definition
  • Validation and Cleansing
  • Monitoring and Managing Ongoing Quality of Data

Quality Scorecards

A data quality scorecard is a management tool that captures a virtual snapshot of the quality levels of your data, presents that information to the user, and provides insight as to where data flaws are impacting business operations and where the most egregious flaws exist within the system. Using data quality rules based on defined dimensions provides a framework for measuring conformance to business data quality expectations.

Your quality scorecard helps you control the following (a minimal sketch of such a scorecard follows the list):

  • Validity of Data
  • Thresholds for Conformance
  • Ongoing Process Control
  • Proactive Monitoring and Alerting
  • Data Standardization
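To make the idea tangible, here is a hedged, toy-sized scorecard in Python with pandas. The columns, rules and the 95% threshold are illustrative assumptions; a real implementation would run rules like these against the warehouse on a schedule and trend the results over time:

```python
import pandas as pd

# Toy sample of a student dimension with deliberate quality problems.
students = pd.DataFrame({
    "student_id": ["S001", "S002", "S003", None],          # one missing ID
    "grade_level": [9, 10, 27, 11],                        # 27 is outside K-12
    "enrollment_date": ["2015-09-01", "2015-09-01", "not a date", "2015-09-02"],
})

# Each rule yields the fraction of rows that pass.
rules = {
    "student_id populated": students["student_id"].notna().mean(),
    "grade_level in 0-12": students["grade_level"].between(0, 12).mean(),
    "enrollment_date parses": pd.to_datetime(
        students["enrollment_date"], errors="coerce").notna().mean(),
}

scorecard = pd.DataFrame({"pass_rate": rules})
scorecard["meets_threshold"] = scorecard["pass_rate"] >= 0.95   # conformance threshold
print(scorecard)
```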


Although this post refers to an education data warehouse, the same principles could be applied to virtually any data warehouse to ensure data quality.   For additional information please contact us anytime!
Accelerating Education Data Warehousing with Informatica Data Quality. A framework to define, control and monitor quality of data.
Download Whitepaper

Data Tsunami… more like an earthquake!!!

7/21/2015


 
We recently read a post by Peter Ku in June titled Are you ready for the “Data Tsunami” from the Affordable Health Care Ruling and found it extremely interesting, although our impression is that this is going to be more of an EARTHQUAKE than a tsunami.

Our team members have worked in health care for a number of years, and they can all tell you that there is definitely no shortage of data. This data is building up under our feet every day in hospitals, clinics and health care offices around the globe; in essence, the data is already out there, hidden, waiting to crack the surface!

Peter’s absolutely right that there will soon be a huge inflow of new enrollments in the US related to the latest affordable health care ruling. This is going to rock the foundation each state has laid in connecting health information and put to the test both the technology and the services that have been developed. One thing is certain: if states have not moved past governance decisions at this point and established standard processes and procedures for managing the information, then we are sure to see more news reports of failed health exchanges soon.

Kudos to Peter for putting together such a great article highlighting the importance of data integration, security and data quality! We have been working with Informatica tools for many years, and they are, without a doubt, leaders in their field.

Our belief is that data is complex only because organizations lack the tools to manage it properly. When managed properly, understanding what to do with data becomes much easier, and it essentially starts to be treated like any other commodity or resource. Some pioneers such as Facebook and Google found the formula to monetize it early, but the rest of the world is catching up!

Tsunami or earthquake, QC Technology Decisions Inc. can help prepare your business with a solid foundation and understanding so you can remain competitive. If you are interested in learning more, please contact us today!


Telemonitoring… is now the time?

7/19/2015


 
In recent years there has been a continued focus on providing patients with intelligent, connected devices, with the promise of improved health and reduced costs. With indicators such as the strong growth of the telemonitoring industry (3 billion[i]), the growth of personal health devices, and the emphasis on IoT (Internet of Things) projects, it’s safe to say there are tremendous opportunities in this market space.

What is driving this industry now?

Further driving innovation in this area, particularly in the US, are government programs that penalize health care providers for readmissions. We believe there may be opportunities for a wide range of telemonitoring services; closely monitoring patients who have recently had cardiac care, as well as those with diabetes and/or hypertension, has shown some promise in reducing readmission rates. The potential for solutions that provide cost avoidance is very real and happening now.

Although the long-term goal is for telemonitoring to provide a solution to readmission issues in the US, monitoring patients from home (or an extended care facility) is, much like monitoring patients in a hospital, still very workflow intensive. At home, there is a strong dependency on patients to report their vitals on a regular basis, and only systems that include the “human element” seem to have shown success. For this reason, telemonitoring programs are expected to struggle somewhat in the next few years, until device usability improves and the costs for technology and workflow drop to a point where it is cheaper for health providers to outsource these services. That said, patients who are able and have the discipline to embrace telemonitoring programs do see improvements and do have an overall feeling of control and empowerment over their health.

What are the current trends?

Telehealth has been a part of modern health care systems for many years, and has traditionally relied on expensive and specialized equipment to connect providers and patients. Although telemonitoring is somewhat of a new concept and the market is still emerging, we expect that telemonitoring will drive down the cost of telehealth and open new opportunities for connected health device manufacturers. Still, the industry is dominated by large players such as Honeywell, GE, and Philips, which will most likely retain their grip on major health providers such as hospitals and government agencies. We expect that within 2-3 years most (if not all) major medical centers in the US will offer telemonitoring services directly to their patients.

One trend that we see growing (and one that would support a wider range of service providers) is best represented by a company called Nurse Next Door, which is using telemonitoring technologies both to gain a competitive advantage and to improve home care services for its patients. Companies such as AMC Health and CareCentrix support these initiatives by providing telemonitoring as a service to home care agencies, clinical trials, accountable care organizations and integrated health systems. Companies such as STI specialize in telemonitoring, providing no-hassle rental programs, central-station medical alert monitoring, and back-end services for cleaning, repair, calibration and maintenance of equipment.

It’s important not to exclude the rise of personal health devices. These consumer devices are all the rage these days, and although they are mostly ineffective and inaccurate when it comes to vitals monitoring, their value appears to be in self-awareness and connectedness. This self-awareness in personal health monitoring is beginning to create a culture of future patients (aka “the connected patient”) who will most likely demand telemonitoring so they continue to feel empowered. It’s expected that the trend of personal health devices will continue and that telemonitoring device hubs will support connecting some of these devices in addition to other clinical monitoring devices the patient may need in their home.

Opportunities

In the US, there are almost 30,000 home health care service establishments; California, Texas and Florida have the most. In Canada, there are comparatively fewer service providers, with only 1,285 establishments.

In Canada, there are approximately 60 companies registered as ICT telemonitoring companies, indicating that this is still a relatively small market. The closest segment to this in the US industry data is the “All other miscellaneous ambulatory health care services” category, of which there are 4,000 companies.

The market leader in home monitoring, including software for its devices, is Honeywell with its Lifestream product line. However, we came across only a few companies in our research that provide off-the-shelf products for telemonitoring, such as Cardiocom, Home Health Hub and Authentidate, which indicates that the industry still requires significant ICT support to bridge the gap by developing custom solutions for customers.

We are always on the lookout for emerging trends in leveraging the vast amounts of data that systems are collecting around the world. If you are interested in learning more about where this industry is headed, or would like assistance with your next real-time data integration project, please contact us anytime.


Who says integration has to be expensive!

6/1/2015


 
Recently we had a great opportunity to work with one of our clients who was implementing a new student information system (Infinite Campus if you’re wondering which one).   If you’ve done this before, congratulations, you know what an undertaking it can be!

Our role was to develop an integration architecture that would meet their immediate need of replacing about 20 integration points between their other systems, while setting the foundation to meet their long-term needs for data warehousing, point-to-point systems integration, and data import/export. Oh, and I forgot to mention: our client was a school district on a tight budget!

We’re familiar with quite a few integration platforms such as Informatica, Oracle, and IBM, each having great integration functionality, but with a hefty price tag. Microsoft also offers an enterprise-class integration platform, included with SQL Server at less than $2,000 per core. SQL Server 2012 Integration Services was our choice for this client.

At first, I think our client was a bit reluctant about the architecture because it was out of their comfort zone. Most school districts simply script their integration points using what’s available to them (Perl, SQL, VBScript, Excel, Access, etc.) and don’t invest in good integration software. Moving away from the “tried and true” is always a bit difficult, but the benefits of using software designed for integration will always beat out any home-grown solution. In the end, with some guidance and good examples, they found their feet and are now starting to see that this type of architecture will pay off for them in the long run.

The main reason for our success in gaining our client’s trust (and ultimately adoption of the system) was that we were able to line up all the components and fit them into an easy-to-understand process. Here is what we defined for them for developing exports (data extracts) using SQL Server Integration Services.

Step 1 – Determine the Requirements
It goes without saying: don’t start a project unless you know what you need to do. The complexity of any export package will depend on how complex your transformation requirements are. Some packages can take minutes to develop, whereas others could take days. The requirements should define both the “sources” and “targets” (destinations) for your export, and if you need to filter or convert data, they must define the logic (business rules) required to transform the data into the correct target format.

Step 2 – Create/Open a SSIS Project
Developing an export requires that you first install SQL Server Data Tools (from the SQL Server installation media). Once installed, you can create your project and one or more packages. We recommend that you create a separate project for each “target” system.

Step 3 – Create a new Package and Develop the Export
The outcome of the development is going to be one or more export packages.   We ended up creating a few templates for our client to get them started so they could see how to build exports in a variety of formats and destinations.

Step 4 – Test your Export
SQL Server Data Tools includes a debugger that lets you run and debug your packages locally.

Step 5 – Deploy your SSIS Project
Once you have a working package, you can use the “Deploy” option in SQL Server Data Tools to deploy your package to your Integration Services catalog. If you’re not using the Integration Services catalog, you’re probably going to run into configuration issues down the road, so it’s highly recommended. With it, you will be able to audit all your package executions, automatically version your projects, and schedule them with confidence that they will run just as you tested them.

Step 6 – Schedule your Package to Run Automatically
Finally, we want to automate our exports. This is done by using SQL Server Agent to run SSIS packages on a schedule you set. When you schedule these, you can have SQL Server notify you if something goes wrong, too!
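If you ever need to kick off a deployed package from outside SQL Server Agent (for example, from an existing Python job runner), the SSIS catalog exposes stored procedures for this. A hedged sketch, assuming a package already deployed to the SSISDB catalog; the server, folder, project and package names are placeholders and the ODBC driver name may differ in your environment:

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=sqlserver01;"
    "DATABASE=SSISDB;Trusted_Connection=yes",
    autocommit=True,
)

batch = """
SET NOCOUNT ON;
DECLARE @execution_id BIGINT;
EXEC [SSISDB].[catalog].[create_execution]
     @folder_name  = ?,
     @project_name = ?,
     @package_name = ?,
     @execution_id = @execution_id OUTPUT;
EXEC [SSISDB].[catalog].[start_execution] @execution_id;
SELECT @execution_id AS execution_id;
"""

cursor = conn.cursor()
row = cursor.execute(batch, ("StudentIS", "Exports", "DailyEnrollmentExport.dtsx")).fetchone()
print(f"Started SSIS execution {row.execution_id}")
```

For routine, scheduled exports, though, a SQL Server Agent job remains the simplest and most auditable option.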

Our vision is to be a leader in best-practice business intelligence and information management solutions that improve performance and communication between decision makers, administrators, employees, and the public. We deliver sustainable solutions for our clients by creating scalable designs that meet both short-term (immediate) and long-term requirements.

Principal consultants at QC are information experts specializing in business intelligence and enterprise information management.   We are committed to providing the best service possible for our clients by offering only qualified and experienced consultants that meet their needs.

If you are interested in learning more about SQL Server Integration Services or any of our other services please contact us today!


    Author

    Shane Quigley is an expert in data warehousing, business intelligence, systems analysis, and solution architecture.



QC Technology Decisions Inc.

Quigley and Company Technology Decisions Inc. is a group of experienced professionals who offer open and practical advice and technology solutions by applying critical thinking to our designs.

info@qctechnology.com 
