We previously shared our views on how AI can improve PLM productivity and collaboration. Here we dive deeper and share our experiences. AI in general, and Machine Learning in particular, have come a long way since Kiri Wagstaff’s 2012 paper ‘Machine Learning that Matters’ [1]. Google using it to recognize house numbers in Street View imagery [2], or Citigroup using it for payment outlier detection [3], are no longer isolated cases. Application domains are at the core of AI activities within many companies, and strategies for efficient deployment and upgrades are considered upfront.

Easy to deploy, easy to use

Developing AI solutions for the enterprise involves far more than developing an algorithm. Here, we explain the challenges we encountered while building our AI-enabled search component, integrated with TECHNIA Value Components for the 3DEXPERIENCE® platform from Dassault Systèmes, and the solution we developed.

There are two aspects of enterprise software that largely determine its success and acceptance: ease of use for end users, and ease of deployment and maintenance for administrators. These concerns may seem alien to many Machine Learning engineers who develop solutions in Jupyter notebooks. However, Machine Learning in particular, and AI in general, have now reached a stage where data pipelines, deployment strategies, automated selection of candidate models, and information security are as important as the algorithm itself. Getting these right puts product teams in a sweet spot where candidate model selection and deployment can go hand in hand, especially when the product must be deployed for multiple customers according to their needs. This, of course, requires a team with a good mix of deep domain knowledge, AI expertise, and deployment experience, working closely with customer teams.

A showcase

Coming back to our solution for AI-enabled search, the first question is about the need, not the technology. Anybody searching an enterprise knowledge base is often surprised by how hard it is to find relevant artefacts. Why can’t the search experience of popular e-commerce websites be replicated in enterprise applications? And for people working in large multinational companies, a pertinent question is how to find relevant parts or documents that were originally created in a different language.

The short video below describes the problem and the effectiveness of our solution using a simple example on a real-world dataset.

Traditional search solutions in enterprise applications are built around adaptations of bag-of-words models. Methods like BM25 [4] have been around for more than 20 years, are very successful, and power popular search solutions [5]. But their limitations quickly show up when compared to recent solutions that can deal with the ‘meaning’ of sentences. This is the gap behind a common user complaint: ‘if I can find products so easily on e-commerce websites, why can’t I find parts and documents just as easily in my PLM system?’ Additionally, adaptations of bag-of-words models struggle in multilingual settings. So the question we asked was: ‘Can the existing search solution be augmented with state-of-the-art AI?’ To our surprise, there was no robust off-the-shelf solution out there. So, we built one.
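For readers unfamiliar with BM25, here is a minimal sketch in plain Python over a toy corpus. The parameter values k1 = 1.5 and b = 0.75 are common textbook defaults, not the configuration of any particular product:

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score each document for `query` with the classic BM25 formula."""
    tokenized = [d.lower().split() for d in docs]
    N = len(tokenized)
    avgdl = sum(len(d) for d in tokenized) / N
    # Document frequency: in how many documents does each term appear?
    df = Counter()
    for d in tokenized:
        df.update(set(d))
    scores = []
    for d in tokenized:
        tf = Counter(d)
        s = 0.0
        for q in query.lower().split():
            if q not in tf:
                continue
            idf = math.log((N - df[q] + 0.5) / (df[q] + 0.5) + 1)
            s += idf * tf[q] * (k1 + 1) / (tf[q] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

docs = [
    "bracket assembly for rear axle",
    "mounting bracket drawing",
    "user manual for portal login",
]
print(bm25_scores("mounting bracket", docs))
```

Note that BM25 rewards exact token overlap only: a query for “fastener” will never match a document that says “screw”, and a German query will not match English documents. That is precisely the limitation described above.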

Our approach

We wanted to strike a balance between the respective merits of traditional search solutions and the additional benefits of AI solutions.

Semantic information retrieval has seen major breakthroughs with new AI solutions built around word embeddings [6] and Transformers [7]. Our challenge was to combine these with the existing search solution in 3DEXPERIENCE, as shown in Fig.1, while delivering a simple deployment scenario for administrators and a superior experience for end users.
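The article does not disclose how the keyword and semantic signals are actually combined; one common approach is a weighted blend of a normalized keyword score with an embedding cosine similarity. The sketch below illustrates the idea with toy three-dimensional vectors standing in for sentence embeddings; the vectors, the `alpha` weight, and all values are hypothetical:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def hybrid_score(keyword_score, query_vec, doc_vec, alpha=0.5):
    """Blend a pre-normalized (0..1) keyword score with embedding similarity.

    `alpha` is an illustrative tuning weight, not a value from the article.
    """
    return alpha * keyword_score + (1 - alpha) * cosine(query_vec, doc_vec)

# Toy vectors standing in for sentence embeddings (hypothetical values).
query_vec = [0.9, 0.1, 0.3]
doc_a = [0.8, 0.2, 0.4]  # semantically close, but few shared keywords
doc_b = [0.1, 0.9, 0.2]  # shares a keyword, but different meaning
print(hybrid_score(0.2, query_vec, doc_a))
print(hybrid_score(0.6, query_vec, doc_b))
```

In this toy example the semantically similar document outranks the one with the stronger keyword match, which is the behavior a hybrid of keyword search and Natural Language Understanding is meant to produce. Because multilingual embedding models map sentences from different languages into a shared vector space, the same blending also covers the cross-language case.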

Fig.1: Our approach combining existing keyword-based search with Natural Language Understanding and user satisfaction estimation

Simple to use, from trial to production

When it comes to deployment, we provide a simple containerized solution, as shown in Fig. 2, that enables customers to try multiple language models, sequentially or in parallel, and evaluate the results. We built the solution on microservices, making it easy to scale and to evaluate candidate models in customer environments. Customers can start small with experimentation and then scale the solution to any infrastructure scenario.
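To make the idea of evaluating candidate models in parallel concrete, here is a small sketch using Python’s `concurrent.futures`. The model functions, queries, relevance judgements, and the precision@1 metric are all stand-ins for illustration; in a real deployment the candidates would be model containers reached over the network:

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-ins for two candidate language models (hypothetical): each
# returns a ranked list of document ids for a query.
def model_a(query):
    return ["doc1", "doc3", "doc2"]

def model_b(query):
    return ["doc2", "doc1", "doc3"]

def precision_at_1(ranking, relevant):
    """Toy metric: 1.0 if the top-ranked result is relevant, else 0.0."""
    return 1.0 if ranking[0] in relevant else 0.0

def evaluate(model, queries, relevant):
    """Average precision@1 of a model over a query set."""
    return sum(precision_at_1(model(q), relevant[q]) for q in queries) / len(queries)

queries = ["mounting bracket", "axle assembly"]
relevant = {"mounting bracket": {"doc1"}, "axle assembly": {"doc1", "doc2"}}

candidates = {"model_a": model_a, "model_b": model_b}
with ThreadPoolExecutor() as pool:
    futures = {name: pool.submit(evaluate, m, queries, relevant)
               for name, m in candidates.items()}
    results = {name: f.result() for name, f in futures.items()}
print(results)  # → {'model_a': 1.0, 'model_b': 0.5}
```

Running the candidates side by side on the same query set, as here, is what lets a customer compare models in their own environment before promoting one to production.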

A simple container orchestration configuration gives customers secure access to our container repositories (AWS ECR), so they are ready to experiment within their own environments. No data leaves the customer’s systems at any point.

We provide integrated resource monitoring and logging for all microservices. We can also easily support deployments on servers that cannot directly access our repositories, since the containers can be downloaded separately.

Fig.2: Simplified deployment in customer environments, enabling a single approach from initial trial through production, with straightforward upgrades and scaling

For the solution architects out there: this solution scales readily to large deployments. We can flexibly adapt it to customer infrastructure and deployment policies, and we provide detailed documentation for self-deployment. We also support anonymized usage analysis and fine-tuning of the models within customer environments, based on specific needs.

In Our Experience…

All of this makes the already great user experience of TECHNIA Value Components even more impressive. It lets us go beyond search, towards smart workflows and collaboration components. This is the starting point of our journey towards using AI to boost productivity and collaboration for end users.

References

[1] K. Wagstaff, “Machine learning that matters,” arXiv preprint arXiv:1206.4656, 2012.

[2] I. Goodfellow, Y. Bulatov, J. Ibarz, S. Arnoud and V. Shet, “Multi-digit number recognition from street view imagery using deep convolutional neural networks,” arXiv preprint arXiv:1312.6082, 2013.

[3] “Citi® Payment Outlier Detection Launches in 90 Countries,” Citigroup Inc., 26 June 2019. [Online]. Available: https://www.citigroup.com/citi/news/2019/190626a.htm.

[4] M. Beaulieu, M. Gatford, X. Huang, S. Robertson, S. Walker and P. Williams, “Okapi at TREC-5,” NIST Special Publication SP, pp. 143–166, 1997.

[5] S. Connelly, “Practical BM25 – Part 2: The BM25 Algorithm and its Variables,” Elasticsearch B.V., 19 April 2018. [Online]. Available: https://www.elastic.co/blog/practical-bm25-part-2-the-bm25-algorithm-and-its-variables.

[6] T. Mikolov, K. Chen, G. Corrado and J. Dean, “Efficient estimation of word representations in vector space,” arXiv preprint arXiv:1301.3781, 2013.

[7] A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. Gomez, Ł. Kaiser and I. Polosukhin, “Attention is all you need,” in Advances in neural information processing systems, 2017.

Continue the Conversation...

Join us at the TECHNIA Software PLM Innovation Forum to learn more about TECHNIA Value Components and more.

Register today