
Technical Deep Dives & Tutorials

Carlos Maroto
A critical part of any data lake implementation is having effective mechanisms for copying data from different repositories into the data lake. In this blog, our architect discusses potential challenges, best practices, and common methods for data acquisition, as well as how to select the most appropriate implementation approaches for your data lake use cases.
Carlos Maroto
Data lakes bring together data from disparate sources, making it easily searchable in order to support your organization's information discovery, analytics, and reporting. But how do you put in place appropriate security measures to ensure your data is well-managed and protected? In this blog, our expert will discuss four key areas you should consider when implementing data lake security.
Derek Rodriguez
A data lake can be tremendously beneficial to organizations looking to acquire enterprise-wide content from multiple sources and extract insights from it. In a recent project for a pharmaceutical client, we leveraged Aspire for Big Data to ingest over one petabyte of unstructured content into their data lake. Read the firsthand story from our architect.
Darren Liu
Online users have grown accustomed to Google-quality search, and your shoppers will expect the same from your e-commerce site. In this blog, our e-commerce expert will discuss practical techniques for maintaining and tuning Oracle Endeca, a common e-commerce site search platform, for better engagement and conversion rates.
Matt Willsmore
Top-quality search accuracy is not achieved with technology alone or through a one-time “quick fix.” It can only be achieved with a careful, continuous improvement process that we refer to as Search Engine Scoring. In this blog, we will share practical details and best practices learned from a real-world search engine scoring process for SharePoint Online.
Mark Stanger
Search algorithm optimisation complements search engine scoring; both play a key part in enhancing your search engine's performance. See how we optimised the search algorithm to improve search relevance for Elsevier's DataSearch application.
Paul Nelson
Quality analysis is critical to maintaining a data mining application's high performance. In this last part of our "Cruising the Data Ocean" blog series, our Chief Architect, Paul Nelson, will discuss various quality analysis techniques and what you should consider when performing quality checks.
Paul Nelson
In part 5 of our "Cruising the Data Ocean" blog series, our Chief Architect, Paul Nelson, provides an overview of some powerful search, analytics, and business intelligence (BI) applications you can build using the Internet data you have acquired and processed.
Paul Nelson
In part 4 of our "Cruising the Data Ocean" blog series, our Chief Architect, Paul Nelson, provides a deep dive into Natural Language Processing (NLP) tools and techniques that can be used to extract insights from unstructured or semi-structured content written in natural languages. Paul will introduce six essential steps (with specific examples) for a successful NLP project.
Paul Nelson
Data preparation is a critical step in your data mining project. In this third blog of our "Cruising the Data Ocean" series, our Chief Architect, Paul Nelson, provides an overview of the data cleansing and formatting techniques you can use to get the highest-quality data for further processing and analysis.