Posts

Showing posts from 2019

New features for Azure IoT Central

Can your IoT solution grow with you? Are you protecting data on devices and in the cloud? Azure IoT Central is built with the properties of highly secure and scalable IoT solutions. New Azure IoT Central features now available:

1. 11 new industry-focused application templates to accelerate solution builders across the retail, healthcare, government, and energy industries.
2. Public APIs for accessing Azure IoT Central features such as device modelling, provisioning, lifecycle management, operations, and data querying.
3. Management of edge devices and IoT Edge module deployments.
4. Seamless device connectivity with IoT Plug and Play.
5. Application export to enable application repeatability.
6. Extensibility, from no/low-code actions to data export to Azure PaaS services.
7. Manageability and scale through multitenancy, for both device and data sovereignty, without sacrificing manageability.
8. User access cont…
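The public APIs mentioned above are reached over plain HTTPS. As a minimal sketch, the snippet below only assembles the pieces of a "list devices" request; the app subdomain, token, and api-version values are placeholders, and the exact endpoint shapes should be checked against the official Azure IoT Central REST API reference.

```python
# Hypothetical sketch of preparing an Azure IoT Central "list devices" call.
# The subdomain, token, and api-version below are placeholders, not real values.
from urllib.parse import urlencode


def build_list_devices_request(app_subdomain: str, api_token: str,
                               api_version: str = "1.0") -> dict:
    """Assemble the URL and headers for a 'list devices' API call."""
    query = urlencode({"api-version": api_version})
    return {
        "url": f"https://{app_subdomain}.azureiotcentral.com/api/devices?{query}",
        "headers": {"Authorization": api_token},
    }


req = build_list_devices_request("my-app", "SharedAccessSignature sr=...")
print(req["url"])
```

From here, any HTTP client can issue the GET request; the response would be a JSON page of device records.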

Google To Acquire Looker For $2.6 Billion

The deal is expected to close later this year, at which point Looker will become part of Google Cloud, helping Google serve customers a more comprehensive analytics solution. Data remains an untapped resource for many organizations and businesses. I believe the addition of Looker to Google Cloud will provide customers with a more comprehensive analytics solution, from ingesting and integrating data to gain insights, to embedded analytics and visualizations that enable enterprises to leverage the power of analytics, machine learning, and AI. That could be supply chain analytics in retail, media analytics in entertainment, or healthcare analytics at global scale. The market for business intelligence software is large. Indeed, in August 2018, IDC said that "Worldwide revenue for big data and business analytics (BDA) solutions was $166 billion, up 11.7% over 2017" and would reach $260 billion…

Salesforce’s $15.7B Tableau Acquisition

The data analytics race has just taken a very interesting turn with Salesforce's acquisition of Tableau. In 2003, Tableau set out to pioneer self-service analytics with an intuitive analytics platform that would empower people of any skill level to work with data. Now it has been acquired by Salesforce. The acquisition is the largest in Salesforce's history, and the company's co-CEO, Marc Benioff, hinted at larger ambitions when he declared that Seattle would become Salesforce's "HQ2" as a result of the Tableau deal. The West Coast is increasingly becoming a larger tech hub encompassing San Francisco, Silicon Valley, and Seattle. This acquisition combines a CRM with an analytics platform: Tableau helps people see and understand data, and Salesforce helps people engage with and understand customers. "Joining forces with Salesforce will enhance our ability to help people everywhere see and understand data," added…

Google Sheets with Tableau

Got your data in Google Sheets? You can connect directly to it in Tableau using its web data connector. This feature has been around for a while now, but if you are new to it, it could be a plus for you. Select the Google Sheets option under "More Servers…" on the Connect menu. After entering your Google Sheets credentials, you will see the list of available sheets. You can select from the list or use the search bar to narrow it down, then click "Connect." What's cool is that if you aren't sure this is exactly the sheet you are looking for, you can easily open it in a web browser by selecting "Open in Google Drive." Once the sheet has been loaded into Tableau, you can drag out the individual sheets from your Google Sheet and join or union them together. Additionally, you can union and join Google Sheets with other data sources. Sounds cool, right? In Tableau, you can see all th…
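Outside Tableau, the "union" the post describes is just stacking rows from sheets that share the same columns. A minimal Python sketch, with invented sheet contents purely for illustration:

```python
# Toy illustration of a union: rows from two sheets with identical columns
# are simply stacked. The sheet data here is made up for the example.
sheet_2018 = [
    {"region": "East", "sales": 100},
    {"region": "West", "sales": 150},
]
sheet_2019 = [
    {"region": "East", "sales": 120},
    {"region": "West", "sales": 180},
]

# A union keeps every row from both sheets, one after the other.
unioned = sheet_2018 + sheet_2019
print(len(unioned))  # 4
```

A join, by contrast, would match rows across sheets on a shared key (such as `region`) rather than stacking them.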

AWS: Benefits of using Amazon S3

Let us start with a quick introduction. Amazon Simple Storage Service, or Amazon S3, provides developers, IT teams, organizations, and individuals with secure, available, cost-effective object storage in the cloud. OK, slow down. What is object storage? It is a computer data storage architecture that manages data as objects, as opposed to other storage architectures like file systems, which manage data as a file hierarchy, and block storage, which manages data as blocks within sectors and tracks. Amazon S3 is easy to use and can store and retrieve any amount of data at any time. Data transfers can be protected with SSL, and objects can be encrypted automatically. Your data is secure, available when needed, and will scale as your needs grow. The last thing you want to worry about is losing valuable data, and Amazon S3 automatically replicates your objects on multiple devices across multiple facilities. In Amazon S3, you create bucke…
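The object-storage model described above (flat buckets of named objects, no real directory tree) can be sketched as a toy in-memory class. This is purely illustrative of the concept; it is not the S3 API, and the bucket and key names are made up.

```python
class ToyObjectStore:
    """A toy, in-memory stand-in for an object store such as S3.

    Objects live in flat 'buckets' keyed by name -- there is no file
    hierarchy, unlike a filesystem. Illustrative only, not the real API.
    """

    def __init__(self):
        self._buckets = {}

    def create_bucket(self, bucket: str) -> None:
        self._buckets.setdefault(bucket, {})

    def put_object(self, bucket: str, key: str, data: bytes) -> None:
        # The key may *look* like a path ("reports/2019/q1.csv"), but it is
        # just a name; object stores do not create real directories.
        self._buckets[bucket][key] = data

    def get_object(self, bucket: str, key: str) -> bytes:
        return self._buckets[bucket][key]


store = ToyObjectStore()
store.create_bucket("my-bucket")
store.put_object("my-bucket", "reports/2019/q1.csv", b"region,sales")
print(store.get_object("my-bucket", "reports/2019/q1.csv"))
```

The slash-separated key is the whole "path": listing by prefix is how object stores fake folders.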

Sales Report: Superstore (Tableau Data Visualization)


Score Board (Tableau Data Visualization)


Microsoft Azure: Choosing Blob Storage vs Data Lake Store

In today’s post, I would like to talk about considerations for choosing between Azure Blob Storage and Azure Data Lake Store when staging data to be loaded into a data warehouse. Here is a data warehouse architecture published by Microsoft, which suggests loading data from your source into Azure Blob Storage. Pause for a second! OK, let's continue! Here are my thoughts on why you may choose one over the other, based on my experience on some projects. It really "depends": in most cases you can't go wrong either way, because both are powerful storage systems. Let's go. Firstly, text files: ADLS handles text files better than ABS, while non-text data like media files and database backup files are better off in ABS. There are trade-offs with both. Secondly, geographic redundancy: ABS gives you that out of the box. For ADLS, i…
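The two considerations discussed so far can be condensed into a toy decision helper. The criteria and labels below are simplifications of the trade-offs in this post, not official Microsoft guidance:

```python
def suggest_store(data_kind: str, needs_geo_redundancy: bool) -> str:
    """Toy helper encoding the two trade-offs discussed in the post.

    data_kind: "text" or "binary" (media files, database backups, ...).
    Real projects weigh many more factors; this is illustrative only.
    """
    if needs_geo_redundancy:
        # Blob Storage offers geo-redundant replication out of the box.
        return "Azure Blob Storage"
    if data_kind == "text":
        # ADLS is the stronger fit for text/analytics workloads.
        return "Azure Data Lake Store"
    # Non-text data (media, backups) is better off in Blob Storage.
    return "Azure Blob Storage"


print(suggest_store("text", needs_geo_redundancy=False))
print(suggest_store("binary", needs_geo_redundancy=False))
```

In practice you would also weigh cost, tooling integration, and access patterns before deciding.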

Azure Data Factory: Introduces Templates

Microsoft recently introduced templates in Azure Data Factory (ADF). I believe this is a very solid addition: data engineers can quickly get started building data factory pipelines, improving developer productivity and reducing development time for recurring workflows. Ready to explore? Let's jump in for a cool ride:

1. Click Create pipeline from template on the Overview page, or click + -> Pipeline from template on the Author page in your data factory UX, to get started.
2. Select any template from the gallery and provide the necessary inputs to use it. You can also read a detailed description of the template or visualize the end-to-end data factory pipeline.
3. You can also create new connections to your data store or compute while providing the template inputs.
4. Once you cli…
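Under the hood, the pipeline a template produces is a JSON definition. As a rough illustration only (the pipeline, activity, and dataset names below are placeholders, and the exact schema should be checked against the ADF documentation), a minimal copy pipeline looks something like:

```json
{
  "name": "CopyFromBlobToSql",
  "properties": {
    "activities": [
      {
        "name": "CopyActivity",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceBlobDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SinkSqlDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ]
  }
}
```

A template essentially pre-fills this structure and asks you only for the inputs, such as the dataset connections.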