SeedTS
We help companies adopt innovative technologies such as APIs, microservices and AI/ML: Transforming Companies with APIs, Microservices, and AI.
Our experience allows us to offer robust solutions in which we apply disruptive innovations to digitally transform our clients' businesses.
Come join the SeedTS Tech-Talks, the live session that will maximize the potential of APIs!
On this journey through Apigee, we will show you how to get the most out of this powerful integration tool.
About the event:
Title: Maximizing the Potential of APIs: A Journey through Apigee
Free event
Date: 15/12/2023, from 10:00 am to 12:30 pm (Friday)
To register, follow the link below and fill in your details on the registration page so we can remind you on the day.
Link: https://www.seedts.com/tech-talks-apigee
In recent months we have seen several open-source large language models (LLMs), such as Llama 2, Falcon and Alpaca, gain prominence for their growing analytical capabilities.
We put together a demonstration of how to install an LLM at your company, on your own infrastructure, and thus gain more control over your data.
Embedding LLMs into corporate systems has been a game changer for many companies around the world, revolutionizing decision-making processes and bringing significant changes to business operations, making them more effective and data-driven.
Are you ready to embrace this change? Let's discuss the potential of LLMs and AI in your operation.
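As a minimal sketch of what running an open model on your own infrastructure can look like (assuming the Hugging Face transformers and accelerate libraries; the model name, prompt and hardware are illustrative, and gated models require accepting their license first):

# Minimal sketch: loading an open LLM locally with Hugging Face transformers.
# Model name and prompt are illustrative; enough GPU/CPU memory is assumed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-chat-hf"  # illustrative choice of open model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

prompt = "Summarize our Q3 support tickets in three bullet points."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))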
Video about LLMs: https://ow.ly/lYTk50PYK4B
Visit our website to learn more: https://ow.ly/I8sz50PYK4A
Count on our highly qualified and experienced Kafka team, giving your company peace of mind and professionals ready to resolve any situation that arises (24x7) in the use of Confluent & Apache Kafka®, whether in the cloud or on-premise, as well as the data involved.
Our support covers every dimension of Kafka, regardless of the size of your infrastructure.
Your peace of mind is our priority, so we offer comprehensive Kafka support: from the initial installation, through patching, to guidance for advanced users of the product.
With this focus, the result is a smoother, fully operational environment for your business's use of Kafka.
We invite you to get in touch to find out more about our Confluent & Apache Kafka® support plans.
Guarantee the peace of mind of having a high-quality service at your disposal at all times.
Link: https://ow.ly/Mu3q50PVFsQ
We are live 🎥
The live session is about Essential Technologies for Organizational Agility.
To watch, follow the link: https://www.youtube.com/watch?v=SD8WcofN060
Tech-Talks information: https://www.seedts.com/tech-talks
Tech-Talk: Essential Technologies for Organizational Agility. Business Agility has been dominating the executive agenda lately, and the question that always comes up is how to incorporate new technologies and modernize the relati...
The event is today!
Don't miss our live session on Essential Technologies for Organizational Agility.
Discover how to incorporate AI into work processes, extract data and indicators automatically, and use IT infrastructure as a service for agile governance.
Join us live and stay on top of market changes.
Today, at 11 am.
The broadcast is free; be sure to register to receive the link to the live session and further information.
Register here: https://www.seedts.com/tech-talks
Don't miss our live session on Essential Technologies for Organizational Agility.
Discover how to incorporate AI into work processes, extract data and indicators automatically, and use IT infrastructure as a service for agile governance.
The session will be broadcast live, so you can follow along and stay on top of market changes.
Our Tech-Talks will take place on 16/06/2023 (Friday), at 11 am. Free broadcast on YouTube.
Be sure to register through the link: https://ow.ly/IFcP50OPGqK
Want to know about some applications of AI? Check out the examples below!
AI is transforming our world and our businesses. Failing to apply AI in your IT is a death sentence for the business.
AI helps streamline processes and manage the supply chain more efficiently, easily track competitive prices in the market, and automate credit assessments faster than ever before. AI can also be used for sentiment analysis on text or voice data, transcribing video or audio, providing a deeper view of the customer, among many other applications.
AI must be integrated into your IT architecture, taking advantage of data in motion to create real-time business intelligence, or even data at rest (databases) to create more sophisticated strategies.
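As one concrete illustration, a minimal sentiment-analysis sketch (assuming the Hugging Face transformers library and its default pretrained sentiment model; the sample reviews are made up):

# Minimal sketch: sentiment analysis on customer feedback with a pretrained model.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default pretrained model
reviews = [
    "Delivery was fast and the support team was helpful.",
    "The invoice was wrong twice and nobody answered my emails.",
]
for review, result in zip(reviews, classifier(reviews)):
    print(f"{result['label']:>8} ({result['score']:.2f})  {review}")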
We help companies apply AI/ML and transform their architectures. Watch your business transform! Get in touch to learn more.
Link: https://www.seedts.co.uk/applied-ai-machine-learning
Do you know the particularities of the Cloud, Multi-Cloud or even Hybrid strategy? Read this post to learn more!
The Cloud strategy refers to the exclusive use of a public or private cloud from a single vendor, while the Multi-Cloud strategy involves the use of more than one public or private cloud, usually from different vendors.
A single-cloud strategy can be easier to manage and more cost-effective, while a multi-cloud strategy offers more flexibility and redundancy, allowing distribution of workloads across different cloud providers.
Implementing a multi-cloud strategy, however, can be more complex and require more technical skill to ensure integration and compatibility between the different clouds. On the other hand, it allows better management of resources and increases availability, flexibility and resilience, in addition to offering different price options for the same service, letting you choose the cloud where the cost is lowest.
A third option is to keep some services on-premise in your data center and others in the cloud; this composition is what we call hybrid.
In the case of on-premise data centers, legacy or not, there are already tools today to manage such infrastructure more easily: IaC has evolved, and we even have AI assistants to create IaC. Going hybrid can therefore allow you to save a lot. But don't be misled: for some services, and if your company is not large, doing it in-house can be fatal to your budget.
Either way, hybrid infrastructure is mainstream right now for large enterprises. Just don't take it on faith that it is best to have everything in a single cloud, or only across multiple clouds, especially when that message comes from those who sell such services (the big clouds: Azure, GC, AWS, etc.).
To learn more about the best multi-cloud or hybrid cloud strategy, including legacy transformation, come talk to us.
Link: https://www.seedts.co.uk/monolith-to-microservices
Governance of APIs and microservices is a super important topic for your SDLC to work properly, but it can be difficult to implement without clear guidance.
Governance of APIs and microservices is essential to guarantee the quality of the services offered and create visibility of deliveries, in addition to reducing the risk of your project never reaching production or arriving with high technical debt already on delivery.
However, having automated governance integrated into the elements that build your software, with clear policies and collected metrics, requires a significant effort from an IT team and can take a long time to implement fully.
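As a tiny illustration of what automated governance can look like in a pipeline, here is a sketch of a CI gate that checks an OpenAPI spec against a few hypothetical policies (the required fields and rules are examples, not our actual checklist; PyYAML is assumed):

# Minimal sketch: fail the build when an OpenAPI spec breaks basic governance rules.
import sys
import yaml  # PyYAML assumed

def check_spec(path: str) -> list[str]:
    with open(path, encoding="utf-8") as f:
        spec = yaml.safe_load(f)
    problems = []
    if not spec.get("info", {}).get("version"):
        problems.append("missing info.version")
    if "x-owner" not in spec.get("info", {}):
        problems.append("missing info.x-owner (no clear API owner)")
    if not spec.get("components", {}).get("securitySchemes"):
        problems.append("no securitySchemes defined")
    return problems

if __name__ == "__main__":
    issues = check_spec(sys.argv[1])
    for issue in issues:
        print("GOVERNANCE:", issue)
    sys.exit(1 if issues else 0)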
We help IT teams deploy automated, integrated governance quickly and flatten the governance maturity curve for APIs and microservices. Want to know more? Contact us!
Link: https://www.seedts.co.uk/api-and-microservices-governance
Infrastructure is one of the fundamental pillars of any business, but maintaining and managing it can be expensive, difficult and time-consuming if done manually.
Every business needs a solid, dynamic infrastructure to operate, but this is not always easy or cheap. Moreover, it can be difficult to update, expand or even shrink it at the click of a button.
With Infrastructure as Code (IaC), you have full control, your infrastructure is written in code.
Rather than dealing with items manually, operations are governed by the same style of rules and constraints used in software development. An entire environment can be created and recreated exactly identically, whenever needed, at the click of a button.
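As a minimal illustration of the idea (a sketch assuming Pulumi's Python SDK with the AWS provider; the resource, names and tags are hypothetical):

# Minimal IaC sketch: the environment is declared as code and can be recreated
# identically with `pulumi up`. Resource names and tags are illustrative.
import pulumi
from pulumi_aws import s3

logs_bucket = s3.Bucket(
    "app-logs",
    tags={"environment": "staging", "managed-by": "pulumi"},
)

pulumi.export("logs_bucket_name", logs_bucket.id)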
Wanna know more about IaC? Chat with us.
Link: https://www.seedts.co.uk/iac-infrastructure-as-code
Find out by reading this post what are the main Machine Learning Clustering techniques!
The main Machine Learning clustering techniques are K-Means and Hierarchical Clustering, commonly supported by Principal Component Analysis (PCA).
K-Means is used to group similar data into separate clusters. This assumes that the data has structure, that is, that certain data are related to each other and, therefore, can be grouped. K-Means seeks to perform a grouping of data in order to minimize the variance between the data of each cluster. This variance is calculated as the distance between the data points and the cluster centroid.
Hierarchical Clustering uses divide-and-conquer techniques to create a hierarchy of groups based on the distance between data points. This uses a top-down approach, starting with all the data being grouped into a single cluster and then subdividing the clusters into more specific clusters as the algorithm progresses.
Principal Components Analysis is used to reduce the number of variables in a dataset, helping to predict which groups the data may be related to. PCA is used to explore, visualize and analyze data and discover hidden patterns.
These three techniques are the most common in Machine Learning clustering, and each has its own level of accuracy. However, there are other techniques, such as the minimum-distance method and decision trees, that can also be used to obtain more accurate results. And if you combine them with traditional data analysis techniques, you can improve the accuracy of the results even further...
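For a sense of how little code this takes, a minimal sketch combining PCA and K-Means with scikit-learn (the toy dataset and the number of clusters are illustrative assumptions):

# Minimal sketch: K-Means after PCA on a toy dataset with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

X = load_iris().data                            # 4 numeric features per sample
X_2d = PCA(n_components=2).fit_transform(X)     # reduce to 2 principal components
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_2d)

print(labels[:10])   # cluster assignment for the first 10 samples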
Want to know more how to apply ML in your IT architecture? Chat with us.
Link: https://www.seedts.co.uk/applied-ai-machine-learning
What are the main mistakes to avoid in an API strategy? To find out, read our post below:
1. Not planning or understanding the company's real need for an API strategy: you must carefully assess the needs and objectives the API has to meet before its specification even begins. Is it to create engagement? To increase revenue? To create partnerships?
2. Not having an API First approach to development. API First is a software development model that emphasizes building an API before you even start coding the application. The goal is to provide a secure and scalable interface to allow applications and services to communicate. The API is designed and developed first, before any application code is written. This allows developers to build applications and services that are portable, scalable, and secure.
Note: Here at SeedTS we even have an accelerator for creating microservices that takes a Swagger (OpenAPI) definition as its starting point, API-First style; see the sketch after this list.
3. Developing APIs that were not correctly specified in terms of context, functionality and access (owner): Does your API do everything, or almost nothing? Is it associated with a product strategy? Is it within a domain context? Does it have an owner? Does its monitoring cover the business side as well as the technical side?
4. Ignoring security concerns: the risks that access to APIs can bring cannot be disregarded, so it is necessary to invest in measures such as authentication, encryption and continuous monitoring of the platform. SQL injection attacks (delete....) or even payload-size attacks can arrive through exposed APIs.
5. Not thinking about scalability: it is necessary to invest in creating APIs and backends able to support a large number of requests, avoiding performance problems once they start being used. Logically, when you use an API manager, the API proxies are the tip of the iceberg, and when used correctly you can take advantage of policies to manage usage peaks and call volumes; in the case of Apigee, the Spike Arrest and Quota policies.
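To make the API-First point from item 2 concrete, a minimal sketch (the FastAPI framework, the path and the model are illustrative choices): the endpoint mirrors a previously agreed Swagger/OpenAPI contract, and the generated schema can be diffed against that contract in CI.

# Minimal API-First flavoured sketch: the contract (path, response shape) came first,
# and the code just implements it. Names and fields are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Orders API", version="1.0.0")

class Order(BaseModel):
    id: int
    status: str

@app.get("/orders/{order_id}", response_model=Order)
def get_order(order_id: int) -> Order:
    return Order(id=order_id, status="PENDING")  # placeholder implementation

# app.openapi() returns the generated OpenAPI document, ready to compare with the agreed spec.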
Want to learn more about API strategy? Contact us.
Link: https://www.seedts.co.uk/api-strategy-done-the-right-way
Your business is growing and this expansion may be limited by the architecture of your legacy systems.
Therefore, invest in a new IT architecture to take advantage of the potential of your business!
Below are 4 tips that can help you modernize a legacy architecture:
Tip 1: Take a holistic approach, considering all the factors that will make the modernization successful, such as technical and business constraints.
Tip 2: Work with all teams, departments, business units and customers to understand these constraints and create your AS-IS map.
Tip 3: Build your TO-BE from a modern reference architecture, covering processes, methodologies, new technologies, security, infrastructure, etc.
Tip 4: Finally, have a documented reference architecture, with a reference environment and reference coding.
If you want to learn more about how we help companies modernize with a leading-edge architecture, contact us!
Link: https://www.seedts.co.uk/reference-architecture
How secure are your APIs?
Below are five very important tips to improve the security of your APIs; a short rate-limiting sketch follows the list.
1. Have consumer authentication and data transmission security:
- Preferably use strong authentication through tokens such as OAuth and OpenID Connect;
- Provide the developer with resources for two-factor authentication;
- Use encryption to store and transmit sensitive data. Have your architecture PCI compliant if possible.
2. Usage limits:
- Set usage limits for the API;
- Prevent an API consumer from making more than one request per second;
- Set behavioral limits for each user; here, ML can help when integrated with an API manager at a more strategic level;
- Use IP-blocking controls for suspicious access, especially when integrated with a SIEM strategy.
3. Data validation:
- Validate input data and formats before processing requests, as bad data generates bad information or can destabilize the environment;
- Utilize security policies that will prevent malicious requests from mixing with incoming data.
4. Security tooling:
- Use security monitoring tools;
- Use vulnerability detection tools;
- Use a password manager;
- Make your APIs work with Service Mesh;
- Create security tests for the APIs.
5. Audit logs:
- Log all accesses to APIs;
- Monitor suspicious requests and also deploy anomaly detection in logs with AI/ML;
- Log API errors and failures.
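As a minimal illustration of tip 2, a sketch of a per-consumer token-bucket limiter in plain Python (the rates are illustrative; in practice an API gateway such as Apigee, with its Spike Arrest and Quota policies, enforces this for you):

# Minimal sketch: per-consumer token-bucket rate limiting (illustrative rates).
import time
from dataclasses import dataclass, field

@dataclass
class TokenBucket:
    rate: float = 1.0          # tokens added per second (about 1 request/second)
    capacity: float = 5.0      # maximum burst size
    tokens: float = 5.0
    updated: float = field(default_factory=time.monotonic)

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

buckets: dict[str, TokenBucket] = {}

def handle_request(consumer_id: str) -> int:
    bucket = buckets.setdefault(consumer_id, TokenBucket())
    return 200 if bucket.allow() else 429   # 429 Too Many Requests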
Link: https://www.seedts.co.uk/apigee-consulting
Data Mesh is a pattern for organizing data management within an organization, in which a domain strategy plays a key role.
With data decentralized into domains, your teams will be able to apply AI/ML according to the needs of each domain, allowing for more specific solutions with greater added value.
Data Mesh also seeks to create awareness of the need to take data seriously, with teams immersed in that culture and responsible for its management and governance.
Data Ownership, Data Governance, Data Catalog, Data Literacy, Data Democratization, Data Integration, Data Modeling, Data Quality, Data Security and Data Culture are important elements for deploying Data Mesh in a company.
To learn more about Data Mesh and its application in your domains, get in touch!
Link: https://www.seedts.co.uk/data-mesh
Do you want to find out the best practice for connecting with Kafka brokers? Read this post!
The best practice for connecting to an Apache Kafka broker is to secure and standardize authentication, using strong security.
It is important to use strong passwords and SSL/TLS certificates to authenticate Kafka cluster participants, and to enable two-factor authentication where possible. A firewall to protect the Kafka broker against external attacks is basic, as is continuously monitoring network traffic and system logs to detect intrusion attempts. Your Kafka must be well taken care of!
It is also important to configure security policies and keep users informed about security best practices through the reference architecture portal (I hope you have one 😊). Devs must know how to create their consumers and producers following these standards.
When Kafka is used as a central element of an architecture, with data flowing to all domains, security standardization must be designed into the security architecture and kept up to date to avoid vulnerabilities.
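As a minimal sketch of an authenticated client (assuming the confluent-kafka Python client; the broker address, credentials, mechanism and topic are placeholders, not recommendations for your cluster):

# Minimal sketch: a producer connecting over SASL_SSL with the confluent-kafka client.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "broker.example.internal:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "SCRAM-SHA-512",
    "sasl.username": "orders-service",
    "sasl.password": "change-me",            # load from a secrets manager in practice
    "ssl.ca.location": "/etc/kafka/ca.pem",  # CA used to verify the brokers
})

producer.produce("orders.created", key="order-123", value=b'{"total": 99.9}')
producer.flush()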
Finally, some questions for anyone working with Kafka to answer on the topic of security:
What security measures are in place to protect both internal and external access to Apache Kafka?
What security policies are used to ensure security in the consumption and production of data to Apache Kafka?
What additional security measures or tools are used to control access to the Apache Kafka system?
Want to know more about Security in Kafka? Chat with us!
Link: https://www.seedts.co.uk/event-kafka-in-real-time
Want to know how to improve Apache Kafka performance? Check out these 3 steps below and improve today!
1) Optimize producer batching: the producer's buffer and batch settings control how many messages are grouped together before being sent to the brokers. Allowing larger batches (with a little linger time) reduces request overhead and increases Kafka's overall throughput; see the sketch after this list.
2) Improve connection latency with brokers: Better connectivity between brokers and clients significantly increases Kafka's performance. Therefore, it is essential to perform latency optimization to improve overall performance.
3) Use a scalable architecture: A scalable architecture makes it easy to add new servers and topics as needed, allowing users to safely grow without directly impacting Kafka's performance. This can be achieved by using load balancing solutions to distribute the load across servers.
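As a minimal sketch of step 1 (the confluent-kafka Python client and the values are illustrative; tune them against your own latency and throughput requirements):

# Minimal sketch: producer settings that favour throughput (illustrative values only).
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "broker.example.internal:9092",
    "batch.size": 131072,          # allow larger batches (bytes)
    "linger.ms": 10,               # wait briefly so batches can fill up
    "compression.type": "lz4",     # fewer bytes on the wire
    "acks": "all",                 # keep durability while tuning for throughput
})

for i in range(10_000):
    producer.produce("metrics", value=f"sample-{i}".encode())
producer.flush()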
Visit our link below for more information!
Link: https://www.seedts.co.uk/event-kafka-in-real-time
Are you looking to integrate systems, have a single source of data and still apply AI within a modern IT architecture? Read the post 😊
Apache Kafka® can help by providing a unified, real-time platform for streaming data. Your data will be able to flow between domains and reach different systems in a unified way.
Its distributed and secure streaming architecture was designed to support extremely high traffic, as it has a scalable distributed messaging system that can handle millions of messages per second. This messaging system is based on a server cluster architecture, which guarantees high availability and fault tolerance.
Furthermore, Apache Kafka® has a replication mechanism to keep state consistent between servers and ensure that messages are delivered even in case of failure. This makes Apache Kafka® extremely reliable.
With Apache Kafka®, you can deliver contextualized, data-driven, AI-powered experiences to customers within a modern, secure, scalable, and robust architecture.
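On the consuming side, a minimal sketch of how different systems tap into the same stream (again assuming the confluent-kafka Python client; broker, group and topic names are placeholders):

# Minimal sketch: a consumer reading a domain topic so downstream systems
# (analytics, AI models, legacy adapters) all see the same events.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker.example.internal:9092",
    "group.id": "recommendation-service",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders.created"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        # Hand the event to an AI model, a dashboard, or a legacy system adapter.
        print(msg.key(), msg.value())
finally:
    consumer.close()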
Chat with us to apply Apache Kafka to your legacy IT, with AI!
Link: https://www.seedts.co.uk/event-kafka-in-real-time
Do you want to know more about NLP? Check out some of its applications below!
The main purpose of NLP (Natural Language Processing) is to extract useful information from text, both in structured and unstructured form. For example, an NLP system can read a scientific article and extract keywords or key phrases, or even categorize the article into a certain area of study.
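As a small taste of keyword extraction, a minimal sketch using TF-IDF (scikit-learn assumed; the sample texts are made up and a real pipeline would use richer techniques):

# Minimal sketch: extracting candidate keywords from short texts with TF-IDF.
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "Transformer models improve machine translation quality across many languages.",
    "Graph databases speed up fraud detection in payment networks.",
]
vectorizer = TfidfVectorizer(stop_words="english")
scores = vectorizer.fit_transform(docs)

for row, doc in zip(scores.toarray(), docs):
    top = sorted(zip(row, vectorizer.get_feature_names_out()), reverse=True)[:3]
    print([term for _, term in top], "<-", doc)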
Furthermore, with the help of advanced computational linguistics techniques, it's possible to understand the context and meaning of certain expressions and texts, which facilitates the understanding of sentences by computers.
Another area where NLP has been heavily used is in the creation of artificial intelligence systems. For example, Cortana and Alexa virtual assistants use NLP to respond to voice commands.
Machine translation systems also rely on NLP to understand the meaning of words in different languages. To learn more, chat with SeedTS!
Link: https://www.seedts.co.uk/applied-ai-machine-learning
Find Out How Machine Learning Coding is Done!
The Machine Learning development process is based on data analysis and learning from data.
First, the data is analyzed to discover patterns or relationships between variables in the dataset.
These patterns are then used to create mathematical models that can be used to predict future outcomes.
These mathematical solutions are called machine learning algorithms.
Once an algorithm is developed, it is tested on a separate dataset to ensure model accuracy.
The algorithms are continually refined through trial and error using different combinations of the data present in the training set. Once a model has met the required levels of accuracy, it can be deployed into production.
This means that the algorithm is ready to process and respond to user input.
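A minimal sketch of that workflow end to end (scikit-learn and its toy dataset are illustrative choices; real projects add feature engineering, cross-validation and monitoring):

# Minimal sketch: split the data, fit a model, check accuracy on held-out data,
# then reuse the trained model on new inputs.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

# "Deployment" here is simply calling predict() on new input.
print("prediction for first test sample:", model.predict(X_test[:1]))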
So this is how Machine Learning is coded! If you want to know more, chat with us!
Link: https://www.seedts.co.uk/applied-ai-machine-learning
How AI governance should ensure responsible use of data.
Governance applied to artificial intelligence (AI) ensures responsible use of data.
This should define the criteria for the use of data by AI algorithms.
It should also specify how decisions made by AI systems are monitored.
Would you like to learn more about governance applied to AI? Contact us.
Link: https://www.seedts.co.uk/applied-ai-machine-learning
Infrastructure is one of the fundamental pillars of any business, but maintaining and managing it can be expensive, difficult and time-consuming if done manually.
Every company needs a solid, dynamic infrastructure to operate, but this is not always easy or cheap. Moreover, it can be difficult to update, expand or even shrink it at the click of a button.
With Infrastructure as Code (IaC), you have full control: your infrastructure is written in code.
Instead of handling items manually, operations are governed by the same style of rules and constraints used in software development. An entire environment can be created and recreated exactly identically, whenever needed, at the click of a button.
Visit our website to learn more: http://ow.ly/M4Qe50N0zAE
Find out in this post what the main Machine Learning clustering techniques are!
The main Machine Learning clustering techniques are K-Means and Hierarchical Clustering, commonly supported by Principal Component Analysis (PCA).
K-Means is used to group similar data into separate clusters. It assumes the data has structure, that is, that certain data points are related to each other and can therefore be grouped. K-Means seeks to group the data so as to minimize the variance within each cluster, calculated as the distance between the data points and the cluster centroid.
Hierarchical Clustering, in turn, uses divide-and-conquer techniques to create a hierarchy of groups based on the distance between data points. It uses a top-down approach, starting with all the data grouped into a single cluster and then subdividing the clusters into more specific clusters as the algorithm progresses.
Principal Component Analysis is used to reduce the number of variables in a dataset, helping to reveal which groups the data may be related to. PCA is used to explore, visualize and analyze data and discover hidden patterns.
These three techniques are the most common in Machine Learning clustering, and each has its own level of accuracy. However, there are other techniques, such as the minimum-distance method and decision trees, that can also be used to obtain more accurate results. And if you combine them with traditional data analysis techniques, you can improve the accuracy of the results even further...
Website: http://ow.ly/siO550N0z9b
Want to discover the best security practice for connecting to Kafka? Read this post!
The best practice for connecting to an Apache Kafka broker is to secure and standardize authentication, using strong security.
Use complex passwords and SSL/TLS certificates to connect to the Kafka cluster, and enable two-factor authentication for infrastructure administrators.
Have a domain authorization strategy for Kafka: use RBAC, especially for working with the schema registry, and a service mesh for consuming and producing data to topics.
It is also important to have clear security policies, published through the reference architecture portal and defined with Architecture Governance, which we hope you have 😊. That is where the policies live that explain how consumers and producers connect to the broker and send data securely, how to handle sensitive data in motion, and which models are implemented.
Below are some questions on the topic of security for anyone working with Kafka:
- What security measures are in place to protect both internal and external access to Apache Kafka?
- What security policies are used to ensure security in the consumption and production of data to Apache Kafka?
- What additional security measures or tools are used to control access to the Apache Kafka system?
All of this so that your broker is not left open to the world and to everyone, even to those working in your company's domains that use Kafka.
Also make sure that whoever produces and consumes data from topics is authorized to do so, and that the data arrives and is delivered with due protection; these are the basic principles of authorization and authentication in the communication channel between producers, Kafka, and their consumers.
Want to know more about Kafka security? Chat with us!
Visit our website: http://ow.ly/Sk6k50N0y0V
Are you looking to integrate systems, have a single source of data, and still apply AI within a modern IT architecture? Read the post 😊
Apache Kafka® can help by providing a unified, real-time platform for streaming data. Your data will be able to flow between domains and reach different systems in a unified way.
Its distributed, secure streaming architecture was designed to support extremely high traffic, with a scalable distributed messaging system that can handle millions of messages per second. This messaging system is based on a server-cluster architecture, which guarantees high availability and fault tolerance.
In addition, Apache Kafka® has a replication mechanism to keep state consistent between servers and ensure that messages are delivered even in case of failure. This makes Apache Kafka® extremely reliable.
With Apache Kafka®, you can offer customers contextualized, data-driven, AI-powered experiences within a modern, secure, scalable and robust architecture.
Talk to us about applying Apache Kafka to your legacy IT, with AI!
Visit our website for more information: http://ow.ly/l7ex50MX8fb
Administering and monitoring Apigee servers in high availability is a complex task that requires specific knowledge across several areas.
A lack of knowledge on the support team can compromise Apigee's operation and lead to data loss or system unavailability.
The Apigee Training offered by SeedTS covers monitoring and platform support, advanced troubleshooting practices, failover, and infrastructure and network best practices, whether in the Private Cloud or On-Premise.
Students learn the Apigee architecture and its modules, as well as how to manage and monitor the servers. In the troubleshooting section, students also learn to quickly identify, diagnose and fix common problems, plus runtime techniques.
Sharpen your skills to support Apigee in complex environments. Click the link to learn more: http://ow.ly/uzgq50MX7H5
Address
71-75 Shelton Street, Covent Garden
London
WC2H9JQ
Opening Hours
Monday: 9am - 6pm
Tuesday: 9am - 6pm
Wednesday: 9am - 6pm
Thursday: 9am - 6pm
Friday: 9am - 6pm