
feat: Add multi-layer, bidirectional RNN with tanh activation #28804

Open · wants to merge 6 commits into base: main

Commits on Sep 4, 2023

  1. Create func_wrapper.py

    func_wrapper.py is a Python module designed to streamline the integration of Hugging Face Transformers into your natural language processing (NLP) projects. It provides a set of input and output conversion wrappers to simplify the process of passing data between your custom functions and Transformers' data structures.
    
    Input conversion wrappers:
    
    - inputs_to_transformers_tensors: converts input data (text, tensors, etc.) into Transformers-compatible data structures. It is particularly useful when your custom functions need to accept diverse input types.
    
    Output conversion wrappers:
    
    - outputs_to_pytorch_tensors: after your custom function returns, this wrapper ensures that the output data is converted into PyTorch tensors or other appropriate formats.
    
    Usage:
    
    1. Import func_wrapper.py into your project.
    2. Initialize a Hugging Face Transformers model and tokenizer.
    3. Wrap your custom function with to_transformers_tensors_and_back. The wrapped function can then accept and return Transformers-compatible data.
    Here's a simple example of how to use func_wrapper.py:
    
    
    import torch
    from transformers import BertForSequenceClassification, BertTokenizer
    from ivy.functional.frontends.transformers.func_wrapper import to_transformers_tensors_and_back
    
    # Initialize the model and tokenizer
    model_name = "bert-base-uncased"
    model = BertForSequenceClassification.from_pretrained(model_name)
    tokenizer = BertTokenizer.from_pretrained(model_name)
    
    # Wrap your custom function using the conversion wrappers
    # (your_function is a placeholder for any callable you define)
    wrapped_function = to_transformers_tensors_and_back(your_function, model, tokenizer)
    
    # Prepare sample input data
    sample_input_text = "This is a sample input text."
    sample_input_tensor = torch.rand((3, 3))
    
    # Call your wrapped function with the sample input data
    output = wrapped_function(sample_input_text, sample_input_tensor)
    
    # The output is automatically converted to PyTorch tensors
    print(output)
    
    Please note that func_wrapper.py is still in development, and further enhancements and refinements are expected. Feedback and contributions to improve its functionality are welcome.
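
    The conversion wrapper described above can be sketched as follows. This is a minimal, hypothetical illustration of the pattern (tokenize string inputs on the way in, unwrap model outputs to plain tensors on the way out); it is not the PR's actual implementation, and the stub names used here are assumptions.

    ```python
    def to_transformers_tensors_and_back(fn, model, tokenizer):
        """Hypothetical sketch: wrap fn so string arguments are tokenized
        into Transformers-compatible encodings before the call, and
        ModelOutput-like results are unwrapped into plain tensors after."""
        def wrapped(*args, **kwargs):
            converted = []
            for arg in args:
                if isinstance(arg, str):
                    # Strings become model-ready encodings (PyTorch tensors)
                    converted.append(tokenizer(arg, return_tensors="pt"))
                else:
                    # Tensors and other objects pass through unchanged
                    converted.append(arg)
            outputs = fn(*converted, **kwargs)
            # Transformers ModelOutput objects expose to_tuple(); use it to
            # hand back raw tensors instead of the wrapper object
            if hasattr(outputs, "to_tuple"):
                outputs = outputs.to_tuple()
            return outputs
        return wrapped
    ```

    In this sketch the heavy lifting is delegated to the tokenizer's `return_tensors="pt"` option, so the wrapped function only ever sees PyTorch-compatible inputs.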
    muzakkirhussain011 committed Sep 4, 2023
    32e73bf

Commits on Aug 16, 2024

  1. 485ded2
  2. cf1ae37
  3. 5d130b5

Commits on Aug 17, 2024

  1. e9f4c37
  2. 281d24d